Jan 21 09:03:20 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 21 09:03:20 crc restorecon[4572]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 09:03:20 crc restorecon[4572]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 
09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 09:03:20
crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 
09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 09:03:20 crc restorecon[4572]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc 
restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:20 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 
crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc 
restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc 
restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc 
restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc 
restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc 
restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 09:03:21 crc restorecon[4572]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 21 09:03:21 crc kubenswrapper[4618]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 09:03:21 crc kubenswrapper[4618]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 21 09:03:21 crc kubenswrapper[4618]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 09:03:21 crc kubenswrapper[4618]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 21 09:03:21 crc kubenswrapper[4618]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 21 09:03:21 crc kubenswrapper[4618]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.422280 4618 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426170 4618 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426195 4618 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426200 4618 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426204 4618 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426208 4618 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426212 4618 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426216 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426219 4618 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426222 4618 feature_gate.go:330] unrecognized 
feature gate: ClusterAPIInstall Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426225 4618 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426230 4618 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426234 4618 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426237 4618 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426240 4618 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426243 4618 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426247 4618 feature_gate.go:330] unrecognized feature gate: Example Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426250 4618 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426254 4618 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426257 4618 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426260 4618 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426263 4618 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426266 4618 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426269 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 09:03:21 crc 
kubenswrapper[4618]: W0121 09:03:21.426272 4618 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426276 4618 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426279 4618 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426282 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426285 4618 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426288 4618 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426292 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426295 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426298 4618 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426302 4618 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426305 4618 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426308 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426311 4618 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426315 4618 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426318 4618 feature_gate.go:330] 
unrecognized feature gate: AlibabaPlatform Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426322 4618 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426325 4618 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426329 4618 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426335 4618 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426339 4618 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426342 4618 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426346 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426349 4618 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426354 4618 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426357 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426361 4618 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426364 4618 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426368 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426372 4618 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426375 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426379 4618 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426382 4618 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426387 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426390 4618 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426393 4618 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426397 4618 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426401 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426404 4618 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426408 4618 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426411 4618 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426414 4618 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426417 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426420 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426423 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426427 4618 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426431 4618 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426434 4618 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.426437 4618 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426531 4618 flags.go:64] FLAG: --address="0.0.0.0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426541 4618 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426549 4618 flags.go:64] FLAG: --anonymous-auth="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426554 4618 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426560 4618 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426564 4618 flags.go:64] FLAG: 
--authentication-token-webhook-cache-ttl="2m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426568 4618 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426573 4618 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426577 4618 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426581 4618 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426585 4618 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426590 4618 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426594 4618 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426598 4618 flags.go:64] FLAG: --cgroup-root="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426601 4618 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426605 4618 flags.go:64] FLAG: --client-ca-file="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426608 4618 flags.go:64] FLAG: --cloud-config="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426613 4618 flags.go:64] FLAG: --cloud-provider="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426616 4618 flags.go:64] FLAG: --cluster-dns="[]" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426621 4618 flags.go:64] FLAG: --cluster-domain="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426624 4618 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426628 4618 flags.go:64] FLAG: --config-dir="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426632 
4618 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426636 4618 flags.go:64] FLAG: --container-log-max-files="5" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426641 4618 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426644 4618 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426648 4618 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426652 4618 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426655 4618 flags.go:64] FLAG: --contention-profiling="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426659 4618 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426663 4618 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426668 4618 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426671 4618 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426676 4618 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426680 4618 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426684 4618 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426687 4618 flags.go:64] FLAG: --enable-load-reader="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426691 4618 flags.go:64] FLAG: --enable-server="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426695 4618 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426699 4618 flags.go:64] FLAG: --event-burst="100" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426703 4618 flags.go:64] FLAG: --event-qps="50" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426707 4618 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426711 4618 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426716 4618 flags.go:64] FLAG: --eviction-hard="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426721 4618 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426725 4618 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426728 4618 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426733 4618 flags.go:64] FLAG: --eviction-soft="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426737 4618 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426740 4618 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426744 4618 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426747 4618 flags.go:64] FLAG: --experimental-mounter-path="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426751 4618 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426755 4618 flags.go:64] FLAG: --fail-swap-on="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426758 4618 flags.go:64] FLAG: --feature-gates="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426763 4618 flags.go:64] FLAG: 
--file-check-frequency="20s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426767 4618 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426771 4618 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426774 4618 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426778 4618 flags.go:64] FLAG: --healthz-port="10248" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426781 4618 flags.go:64] FLAG: --help="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426785 4618 flags.go:64] FLAG: --hostname-override="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426788 4618 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426792 4618 flags.go:64] FLAG: --http-check-frequency="20s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426796 4618 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426800 4618 flags.go:64] FLAG: --image-credential-provider-config="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426803 4618 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426807 4618 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426811 4618 flags.go:64] FLAG: --image-service-endpoint="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426814 4618 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426818 4618 flags.go:64] FLAG: --kube-api-burst="100" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426821 4618 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426825 
4618 flags.go:64] FLAG: --kube-api-qps="50" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426829 4618 flags.go:64] FLAG: --kube-reserved="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426833 4618 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426836 4618 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426840 4618 flags.go:64] FLAG: --kubelet-cgroups="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426843 4618 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426846 4618 flags.go:64] FLAG: --lock-file="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426850 4618 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426853 4618 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426857 4618 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426863 4618 flags.go:64] FLAG: --log-json-split-stream="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426867 4618 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426871 4618 flags.go:64] FLAG: --log-text-split-stream="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426874 4618 flags.go:64] FLAG: --logging-format="text" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426878 4618 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426882 4618 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426885 4618 flags.go:64] FLAG: --manifest-url="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426889 4618 
flags.go:64] FLAG: --manifest-url-header="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426895 4618 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426898 4618 flags.go:64] FLAG: --max-open-files="1000000" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426903 4618 flags.go:64] FLAG: --max-pods="110" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426907 4618 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426910 4618 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426914 4618 flags.go:64] FLAG: --memory-manager-policy="None" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426918 4618 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426922 4618 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426925 4618 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426929 4618 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426938 4618 flags.go:64] FLAG: --node-status-max-images="50" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426942 4618 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426945 4618 flags.go:64] FLAG: --oom-score-adj="-999" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426949 4618 flags.go:64] FLAG: --pod-cidr="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426952 4618 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426958 4618 flags.go:64] FLAG: --pod-manifest-path="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426962 4618 flags.go:64] FLAG: --pod-max-pids="-1" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426965 4618 flags.go:64] FLAG: --pods-per-core="0" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426969 4618 flags.go:64] FLAG: --port="10250" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426973 4618 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426976 4618 flags.go:64] FLAG: --provider-id="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426979 4618 flags.go:64] FLAG: --qos-reserved="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426983 4618 flags.go:64] FLAG: --read-only-port="10255" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426987 4618 flags.go:64] FLAG: --register-node="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426990 4618 flags.go:64] FLAG: --register-schedulable="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.426994 4618 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427000 4618 flags.go:64] FLAG: --registry-burst="10" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427004 4618 flags.go:64] FLAG: --registry-qps="5" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427007 4618 flags.go:64] FLAG: --reserved-cpus="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427013 4618 flags.go:64] FLAG: --reserved-memory="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427017 4618 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 
09:03:21.427021 4618 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427025 4618 flags.go:64] FLAG: --rotate-certificates="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427029 4618 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427032 4618 flags.go:64] FLAG: --runonce="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427036 4618 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427040 4618 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427044 4618 flags.go:64] FLAG: --seccomp-default="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427047 4618 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427051 4618 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427055 4618 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427059 4618 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427062 4618 flags.go:64] FLAG: --storage-driver-password="root" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427066 4618 flags.go:64] FLAG: --storage-driver-secure="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427069 4618 flags.go:64] FLAG: --storage-driver-table="stats" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427073 4618 flags.go:64] FLAG: --storage-driver-user="root" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427077 4618 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427081 4618 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 21 
09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427085 4618 flags.go:64] FLAG: --system-cgroups="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427089 4618 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427095 4618 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427099 4618 flags.go:64] FLAG: --tls-cert-file="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427102 4618 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427108 4618 flags.go:64] FLAG: --tls-min-version="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427111 4618 flags.go:64] FLAG: --tls-private-key-file="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427115 4618 flags.go:64] FLAG: --topology-manager-policy="none" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427118 4618 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427122 4618 flags.go:64] FLAG: --topology-manager-scope="container" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427126 4618 flags.go:64] FLAG: --v="2" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427131 4618 flags.go:64] FLAG: --version="false" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427135 4618 flags.go:64] FLAG: --vmodule="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427154 4618 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427159 4618 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427253 4618 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427257 4618 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427262 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427265 4618 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427269 4618 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427272 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427276 4618 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427279 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427282 4618 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427285 4618 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427288 4618 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427292 4618 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427295 4618 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427298 4618 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427301 4618 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427305 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 09:03:21 crc kubenswrapper[4618]: 
W0121 09:03:21.427308 4618 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427311 4618 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427314 4618 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427317 4618 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427321 4618 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427324 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427327 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427330 4618 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427333 4618 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427337 4618 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427339 4618 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427342 4618 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427346 4618 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427348 4618 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427351 4618 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427355 4618 feature_gate.go:330] unrecognized feature gate: Example Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427358 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427362 4618 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427366 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427369 4618 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427373 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427376 4618 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427381 4618 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427385 4618 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427388 4618 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427392 4618 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427396 4618 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427400 4618 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427404 4618 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427407 4618 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427411 4618 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427414 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427418 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427421 4618 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427424 4618 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427428 4618 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427432 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427435 4618 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427438 4618 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427441 4618 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 09:03:21 crc 
kubenswrapper[4618]: W0121 09:03:21.427444 4618 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427447 4618 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427450 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427464 4618 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427467 4618 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427470 4618 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427474 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427477 4618 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427480 4618 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427483 4618 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427486 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427490 4618 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427494 4618 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427497 4618 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.427501 4618 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.427513 4618 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.434254 4618 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.434282 4618 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434361 4618 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434374 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434379 4618 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434384 4618 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434387 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434391 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434395 4618 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434399 4618 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434402 4618 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434405 4618 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434409 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434412 4618 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434415 4618 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434418 4618 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434421 4618 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434424 4618 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434427 4618 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434430 4618 
feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434433 4618 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434436 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434439 4618 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434442 4618 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434445 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434448 4618 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434451 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434463 4618 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434467 4618 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434471 4618 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434475 4618 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434479 4618 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434484 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434488 4618 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434491 4618 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434495 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434505 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434509 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434512 4618 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434517 4618 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434522 4618 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434526 4618 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434530 4618 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434534 4618 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434537 4618 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434541 4618 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434544 4618 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434548 4618 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434552 4618 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434555 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434558 4618 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434561 4618 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434565 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434568 4618 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434571 4618 feature_gate.go:330] 
unrecognized feature gate: InsightsRuntimeExtractor Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434575 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434578 4618 feature_gate.go:330] unrecognized feature gate: Example Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434581 4618 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434585 4618 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434588 4618 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434591 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434594 4618 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434597 4618 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434600 4618 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434603 4618 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434606 4618 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434609 4618 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434612 4618 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434615 4618 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 09:03:21 
crc kubenswrapper[4618]: W0121 09:03:21.434618 4618 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434622 4618 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434625 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434629 4618 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.434635 4618 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434946 4618 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434953 4618 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434957 4618 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434961 4618 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434964 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434968 4618 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434972 4618 
feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434975 4618 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434978 4618 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434981 4618 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434984 4618 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434988 4618 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434991 4618 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434993 4618 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434996 4618 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.434999 4618 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435003 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435006 4618 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435009 4618 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435012 4618 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435015 4618 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 
21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435018 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435022 4618 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435025 4618 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435028 4618 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435031 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435034 4618 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435037 4618 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435041 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435045 4618 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435049 4618 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435053 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435056 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435060 4618 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435064 4618 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435067 4618 feature_gate.go:330] 
unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435070 4618 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435074 4618 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435077 4618 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435082 4618 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435087 4618 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435091 4618 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435096 4618 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435100 4618 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435103 4618 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435106 4618 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435109 4618 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435113 4618 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435116 4618 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 09:03:21 crc kubenswrapper[4618]: 
W0121 09:03:21.435119 4618 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435122 4618 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435125 4618 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435128 4618 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435132 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435135 4618 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435154 4618 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435158 4618 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435161 4618 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435164 4618 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435168 4618 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435171 4618 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435174 4618 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435177 4618 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435181 4618 feature_gate.go:330] unrecognized feature 
gate: EtcdBackendQuota Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435184 4618 feature_gate.go:330] unrecognized feature gate: Example Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435188 4618 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435192 4618 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435196 4618 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435200 4618 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435205 4618 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.435216 4618 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.435222 4618 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.435376 4618 server.go:940] "Client rotation is on, will bootstrap in background" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.438240 4618 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.438302 4618 certificate_store.go:130] 
Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.439036 4618 server.go:997] "Starting client certificate rotation" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.439060 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.439652 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-18 01:19:14.306177061 +0000 UTC Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.439701 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.448961 4618 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.451094 4618 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.451497 4618 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.460228 4618 log.go:25] "Validated CRI v1 runtime API" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.478035 4618 log.go:25] "Validated CRI v1 image API" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.479433 4618 server.go:1437] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.482181 4618 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-21-09-00-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.482209 4618 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.494134 4618 manager.go:217] Machine: {Timestamp:2026-01-21 09:03:21.49260602 +0000 UTC m=+0.243073357 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0573d94e-a416-4c81-b057-07a8619fdfca BootID:ba5df0e4-fb86-421b-948d-88ebb5510825 Filesystems:[{Device:/dev/shm 
DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:21:71:35 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:21:71:35 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:4c:73:b6 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:5d:bf:78 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:a3:3d:19 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:22:18:10 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:2e:36:7b:8b:72:36 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:f0:54:6f:71:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} 
{Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: 
DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.494332 4618 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.494404 4618 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.494915 4618 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495050 4618 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495079 4618 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495256 4618 topology_manager.go:138] "Creating topology manager with none policy" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495265 4618 container_manager_linux.go:303] "Creating device plugin manager" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495577 4618 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495602 4618 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495674 4618 state_mem.go:36] "Initialized new in-memory state store" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.495749 4618 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.497188 4618 kubelet.go:418] "Attempting to sync node with API server" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.497206 4618 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.497218 4618 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.497226 4618 kubelet.go:324] "Adding apiserver pod source" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.497236 4618 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.499594 4618 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.500236 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.500270 4618 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.500297 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError" Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.500322 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.500366 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.501722 4618 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502607 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502626 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502633 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502639 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502648 4618 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502655 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502661 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502670 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502676 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502683 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502705 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.502712 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.503515 4618 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.503968 4618 server.go:1280] "Started kubelet" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.505762 4618 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.505794 4618 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 21 09:03:21 crc systemd[1]: Started Kubernetes Kubelet. 
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.506266 4618 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.506209 4618 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.507585 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.507613 4618 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.507713 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 06:52:35.781309846 +0000 UTC Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.507846 4618 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.507863 4618 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.507873 4618 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.507891 4618 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.508055 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="200ms" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.508452 4618 
factory.go:55] Registering systemd factory Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.508482 4618 factory.go:221] Registration of the systemd container factory successfully Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.511023 4618 server.go:460] "Adding debug handlers to kubelet server" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.511620 4618 factory.go:153] Registering CRI-O factory Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.511655 4618 factory.go:221] Registration of the crio container factory successfully Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.511705 4618 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.511720 4618 factory.go:103] Registering Raw factory Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.511732 4618 manager.go:1196] Started watching for new ooms in manager Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.512886 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.512946 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.513690 4618 manager.go:319] Starting recovery of all containers Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 
09:03:21.511384 4618 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.98:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188cb393e933f3f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 09:03:21.503822841 +0000 UTC m=+0.254290157,LastTimestamp:2026-01-21 09:03:21.503822841 +0000 UTC m=+0.254290157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514530 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514612 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514664 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514714 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514773 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514823 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514868 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514914 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.514988 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515042 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515091 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515167 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515219 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515267 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515345 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515396 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 
09:03:21.515442 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515498 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515567 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515619 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515673 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515725 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515773 4618 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515821 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515874 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515929 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.515980 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516028 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516074 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516122 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516203 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516253 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516299 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516363 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516411 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516468 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516524 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516573 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516619 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516666 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516713 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516767 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516814 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516859 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516908 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.516955 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517006 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" 
seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517074 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517124 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517188 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517235 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517280 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517340 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 
09:03:21.517390 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517437 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517502 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517557 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517612 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517661 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517706 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517753 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517798 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517851 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517901 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517948 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.517996 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518042 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518088 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518154 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518208 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518254 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518309 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518361 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518414 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518474 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518523 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518571 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518618 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518668 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518714 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518759 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518804 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518850 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518903 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.518952 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519000 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519046 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519092 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519153 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519258 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519310 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519357 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519405 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519452 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519521 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519570 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519617 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519663 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519712 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519765 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519813 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519877 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519923 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.519968 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520027 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520083 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520133 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520199 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520265 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520317 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520371 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520420 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520485 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520536 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.520584 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521069 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521169 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521228 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521510 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521573 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521630 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521690 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521742 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521796 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521845 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521898 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.521954 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522004 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522055 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522105 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522178 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522389 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522443 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522521 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522574 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522630 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522690 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.522743 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523295 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523359 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523412 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523473 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523537 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523592 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523810 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523860 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523915 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.523968 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524032 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524082 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524131 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524202 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524262 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524320 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524373 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524423 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524486 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524539 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524593 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524644 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524699 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524780 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524833 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524848 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524865 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524877 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524888 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524900 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524910 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524921 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524931 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524940 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524953 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524963 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524972 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.524992 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525003 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525016 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525026 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525036 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525047 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121
09:03:21.525056 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525069 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525078 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525087 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525098 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525123 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525136 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525175 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525190 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525204 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525213 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525225 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525238 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525248 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525259 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525269 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525282 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525293 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525302 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525314 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525375 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525391 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525402 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.525416 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.527013 4618 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.527042 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.527056 4618 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.527065 4618 reconstruct.go:97] "Volume reconstruction finished" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.527071 4618 reconciler.go:26] "Reconciler: start to sync state" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.528297 4618 manager.go:324] Recovery completed Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.535063 4618 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.536341 4618 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.536418 4618 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.536529 4618 kubelet.go:2335] "Starting kubelet main sync loop" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.536664 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.536870 4618 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.537246 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.537282 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.537720 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.537790 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.537842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.538368 4618 
cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.538400 4618 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.538415 4618 state_mem.go:36] "Initialized new in-memory state store" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.545087 4618 policy_none.go:49] "None policy: Start" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.546082 4618 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.546108 4618 state_mem.go:35] "Initializing new in-memory state store" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.588855 4618 manager.go:334] "Starting Device Plugin manager" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.588888 4618 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.588899 4618 server.go:79] "Starting device plugin registration server" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.589126 4618 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.589173 4618 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.589277 4618 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.589355 4618 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.589365 4618 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.596932 4618 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed 
to get node info: node \"crc\" not found" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.637246 4618 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.637343 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638065 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638100 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638111 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638263 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638487 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638526 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.638940 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639021 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639051 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639076 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639097 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639104 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639630 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639654 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.639662 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.640374 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.640401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.640409 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.640511 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: 
I0121 09:03:21.640590 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.640613 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641742 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641761 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641769 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641876 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641908 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.641992 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642059 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642085 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642495 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642519 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642527 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642611 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642633 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642637 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.642659 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.643287 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.643316 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.643324 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.689501 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.690005 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.690031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.690040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.690055 4618 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.690839 4618 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.98:6443: connect: connection refused" node="crc" Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.709028 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="400ms" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729571 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729630 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729669 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729703 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729723 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729742 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729777 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729806 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729844 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729864 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729915 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.729971 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.730015 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.730079 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.730113 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831535 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831577 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831600 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831619 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831633 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831646 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831638 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831658 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831671 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831672 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831687 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831701 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831708 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831714 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831733 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831738 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831744 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831753 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831759 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831771 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831780 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831788 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831799 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831806 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831774 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831818 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831830 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831846 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831790 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.831878 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.891250 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.892036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.892065 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.892073 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.892090 4618 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: E0121 09:03:21.892434 4618 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.98:6443: connect: connection refused" node="crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.960681 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.975340 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.980480 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-957a246230310f68ec4ec9ae8a06c2eef39315828dea7a7e4363c4137a65bee4 WatchSource:0}: Error finding container 957a246230310f68ec4ec9ae8a06c2eef39315828dea7a7e4363c4137a65bee4: Status 404 returned error can't find the container with id 957a246230310f68ec4ec9ae8a06c2eef39315828dea7a7e4363c4137a65bee4
Jan 21 09:03:21 crc kubenswrapper[4618]: I0121 09:03:21.981709 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.993879 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-90199ba0bb86cd1f2c70570f3e937cefe47960483037d0ec2e69ba64cd9301f5 WatchSource:0}: Error finding container 90199ba0bb86cd1f2c70570f3e937cefe47960483037d0ec2e69ba64cd9301f5: Status 404 returned error can't find the container with id 90199ba0bb86cd1f2c70570f3e937cefe47960483037d0ec2e69ba64cd9301f5
Jan 21 09:03:21 crc kubenswrapper[4618]: W0121 09:03:21.996725 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a6bc4c97123b7679f1c498cbb768cc5a7eca37cd01fe16f76b83ad45973e2867 WatchSource:0}: Error finding container a6bc4c97123b7679f1c498cbb768cc5a7eca37cd01fe16f76b83ad45973e2867: Status 404 returned error can't find the container with id a6bc4c97123b7679f1c498cbb768cc5a7eca37cd01fe16f76b83ad45973e2867
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.005247 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.008862 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 21 09:03:22 crc kubenswrapper[4618]: W0121 09:03:22.014000 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2d326b8c04047e618f484468b12113200fa81f0f0681c613759040143bab3bd7 WatchSource:0}: Error finding container 2d326b8c04047e618f484468b12113200fa81f0f0681c613759040143bab3bd7: Status 404 returned error can't find the container with id 2d326b8c04047e618f484468b12113200fa81f0f0681c613759040143bab3bd7
Jan 21 09:03:22 crc kubenswrapper[4618]: W0121 09:03:22.017648 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9bd2475725a5f6b675f884c4b5cf545455c8efb296b48a29973376239f213063 WatchSource:0}: Error finding container 9bd2475725a5f6b675f884c4b5cf545455c8efb296b48a29973376239f213063: Status 404 returned error can't find the container with id 9bd2475725a5f6b675f884c4b5cf545455c8efb296b48a29973376239f213063
Jan 21 09:03:22 crc kubenswrapper[4618]: E0121 09:03:22.110120 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="800ms"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.293475 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.294462 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.294490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.294502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.294524 4618 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 09:03:22 crc kubenswrapper[4618]: E0121 09:03:22.294791 4618 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.98:6443: connect: connection refused" node="crc"
Jan 21 09:03:22 crc kubenswrapper[4618]: W0121 09:03:22.414989 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused
Jan 21 09:03:22 crc kubenswrapper[4618]: E0121 09:03:22.415054 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.507772 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 19:19:01.481190193 +0000 UTC
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.508071 4618 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.540639 4618 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="480d85e64344fc46aa5e256a518d41ac98b9c42b510611e8a50e23e85b0bcd25" exitCode=0
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.540710 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"480d85e64344fc46aa5e256a518d41ac98b9c42b510611e8a50e23e85b0bcd25"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.540784 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9bd2475725a5f6b675f884c4b5cf545455c8efb296b48a29973376239f213063"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.540850 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.541787 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.541835 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.541845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.542033 4618 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2" exitCode=0
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.542079 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.542097 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2d326b8c04047e618f484468b12113200fa81f0f0681c613759040143bab3bd7"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.542168 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.543261 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.543290 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.543308 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.546098 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.546128 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6bc4c97123b7679f1c498cbb768cc5a7eca37cd01fe16f76b83ad45973e2867"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.547266 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649" exitCode=0
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.547311 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.547363 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90199ba0bb86cd1f2c70570f3e937cefe47960483037d0ec2e69ba64cd9301f5"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.547467 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.548500 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.548521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.548529 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.549274 4618 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5" exitCode=0
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.549310 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.549324 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"957a246230310f68ec4ec9ae8a06c2eef39315828dea7a7e4363c4137a65bee4"}
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.549387 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.549847 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.550439 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.550460 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.550468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.550987 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.551026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:22 crc kubenswrapper[4618]: I0121 09:03:22.551036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:22 crc kubenswrapper[4618]: W0121 09:03:22.643705 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused
Jan 21 09:03:22 crc kubenswrapper[4618]: E0121 09:03:22.643845 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError"
Jan 21 09:03:22 crc kubenswrapper[4618]: W0121 09:03:22.660568 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused
Jan 21 09:03:22 crc kubenswrapper[4618]: E0121 09:03:22.660620 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError"
Jan 21 09:03:22 crc kubenswrapper[4618]: E0121 09:03:22.911823 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="1.6s"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.095118 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.095992 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.096027 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.096036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.096059 4618 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 09:03:23 crc kubenswrapper[4618]: E0121 09:03:23.096398 4618 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.98:6443: connect: connection refused" node="crc"
Jan 21 09:03:23 crc kubenswrapper[4618]: W0121 09:03:23.104891 4618 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.98:6443: connect: connection refused
Jan 21 09:03:23 crc kubenswrapper[4618]: E0121 09:03:23.104956 4618 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.98:6443: connect: connection refused" logger="UnhandledError"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.507903 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:01:12.65126989 +0000 UTC
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.552713 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.552745 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.552754 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.552819 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.553365 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.553965 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.553987 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.553995 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.555950 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.555977 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.555987 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.556028 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.556509 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.556528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.556536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.558744 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.558767 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.558777 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.558785 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.558792 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.558843 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.559292 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.559310 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.559318 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.560652 4618 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce" exitCode=0
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.560689 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.560750 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.561163 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.561181 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.561188 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.562763 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f5e1cb6f08fcde9f0cc23af14feb0b5639b956cfe132528c9a2cf70cd4104b49"}
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.562811 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.563235 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.563254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:23 crc kubenswrapper[4618]: I0121 09:03:23.563263 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.508896 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving:
Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 22:03:30.213590378 +0000 UTC Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.566309 4618 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674" exitCode=0 Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.566404 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.566673 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674"} Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.566794 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.567047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.567071 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.567080 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.567281 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.567303 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.567310 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.697104 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.697811 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.697848 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.697858 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:24 crc kubenswrapper[4618]: I0121 09:03:24.697882 4618 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.108080 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.108214 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.108260 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.109000 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.109027 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.109034 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.509814 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 09:31:23.528874846 +0000 UTC Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571102 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7"} Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571153 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4"} Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571165 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f"} Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571173 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79"} Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571182 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86"} Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571266 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571796 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571819 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:25 crc kubenswrapper[4618]: I0121 09:03:25.571827 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.510600 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:20:50.718896485 +0000 UTC Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.659719 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.660050 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.660081 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.660847 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.660878 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.660887 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.755642 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.755774 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.756460 4618 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.756480 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:26 crc kubenswrapper[4618]: I0121 09:03:26.756488 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:27 crc kubenswrapper[4618]: I0121 09:03:27.511604 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 18:56:58.944974404 +0000 UTC Jan 21 09:03:27 crc kubenswrapper[4618]: I0121 09:03:27.721357 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 09:03:27 crc kubenswrapper[4618]: I0121 09:03:27.721477 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:27 crc kubenswrapper[4618]: I0121 09:03:27.722060 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:27 crc kubenswrapper[4618]: I0121 09:03:27.722088 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:27 crc kubenswrapper[4618]: I0121 09:03:27.722097 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.508133 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.508229 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.509099 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.509133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.509160 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.511841 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:10:11.036017108 +0000 UTC Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.512332 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.576577 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.577226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.577258 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:28 crc kubenswrapper[4618]: I0121 09:03:28.577268 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:29 crc kubenswrapper[4618]: I0121 09:03:29.512361 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 13:42:58.498794802 +0000 UTC Jan 21 09:03:29 crc kubenswrapper[4618]: I0121 09:03:29.979585 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 21 
09:03:29 crc kubenswrapper[4618]: I0121 09:03:29.979934 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:29 crc kubenswrapper[4618]: I0121 09:03:29.980818 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:29 crc kubenswrapper[4618]: I0121 09:03:29.980856 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:29 crc kubenswrapper[4618]: I0121 09:03:29.980866 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:30 crc kubenswrapper[4618]: I0121 09:03:30.513499 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 03:59:03.31192227 +0000 UTC Jan 21 09:03:31 crc kubenswrapper[4618]: I0121 09:03:31.514300 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:39:58.959245557 +0000 UTC Jan 21 09:03:31 crc kubenswrapper[4618]: E0121 09:03:31.597614 4618 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 21 09:03:31 crc kubenswrapper[4618]: I0121 09:03:31.654051 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:31 crc kubenswrapper[4618]: I0121 09:03:31.654176 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:31 crc kubenswrapper[4618]: I0121 09:03:31.654885 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:31 crc kubenswrapper[4618]: I0121 09:03:31.654912 4618 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:31 crc kubenswrapper[4618]: I0121 09:03:31.654921 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.462504 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.462657 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.463630 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.463652 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.463660 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.466063 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.515233 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 17:09:26.707338488 +0000 UTC Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.546737 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.546821 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.547549 4618 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.547571 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.547578 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.582329 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.582973 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.582998 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:32 crc kubenswrapper[4618]: I0121 09:03:32.583006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.508092 4618 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.515200 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.515318 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 02:54:55.250523513 +0000 UTC Jan 21 09:03:33 crc kubenswrapper[4618]: E0121 09:03:33.554747 4618 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while 
requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.584176 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.584782 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.584806 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.584813 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.755316 4618 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.755366 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.758168 4618 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 21 09:03:33 crc kubenswrapper[4618]: I0121 09:03:33.758354 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 21 09:03:34 crc kubenswrapper[4618]: I0121 09:03:34.515640 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 17:40:40.354336093 +0000 UTC Jan 21 09:03:35 crc kubenswrapper[4618]: I0121 09:03:35.462718 4618 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 09:03:35 crc kubenswrapper[4618]: I0121 09:03:35.462764 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 09:03:35 crc kubenswrapper[4618]: I0121 09:03:35.516760 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 08:35:43.543359549 +0000 UTC Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.517408 4618 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:19:23.140244678 +0000 UTC Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.664981 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.665096 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.665772 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.665807 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.665819 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:36 crc kubenswrapper[4618]: I0121 09:03:36.668107 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:37 crc kubenswrapper[4618]: I0121 09:03:37.518398 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 13:53:55.054927709 +0000 UTC Jan 21 09:03:37 crc kubenswrapper[4618]: I0121 09:03:37.563843 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 09:03:37 crc kubenswrapper[4618]: I0121 09:03:37.573286 4618 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 09:03:37 crc kubenswrapper[4618]: I0121 09:03:37.591296 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:37 crc 
kubenswrapper[4618]: I0121 09:03:37.591822 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:37 crc kubenswrapper[4618]: I0121 09:03:37.591844 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:37 crc kubenswrapper[4618]: I0121 09:03:37.591853 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.518748 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 10:22:05.814179234 +0000 UTC Jan 21 09:03:38 crc kubenswrapper[4618]: E0121 09:03:38.752232 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754100 4618 trace.go:236] Trace[1423796111]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 09:03:25.727) (total time: 13026ms): Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[1423796111]: ---"Objects listed" error: 13026ms (09:03:38.754) Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[1423796111]: [13.026121536s] [13.026121536s] END Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754120 4618 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754100 4618 trace.go:236] Trace[1960178802]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 09:03:25.069) (total time: 13684ms): Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[1960178802]: ---"Objects listed" error: 13684ms (09:03:38.754) Jan 21 09:03:38 crc 
kubenswrapper[4618]: Trace[1960178802]: [13.684514023s] [13.684514023s] END Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754201 4618 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754497 4618 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754515 4618 trace.go:236] Trace[480636697]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 09:03:25.505) (total time: 13248ms): Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[480636697]: ---"Objects listed" error: 13248ms (09:03:38.754) Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[480636697]: [13.248653578s] [13.248653578s] END Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.754527 4618 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.755284 4618 trace.go:236] Trace[1533533631]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 09:03:25.201) (total time: 13553ms): Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[1533533631]: ---"Objects listed" error: 13553ms (09:03:38.755) Jan 21 09:03:38 crc kubenswrapper[4618]: Trace[1533533631]: [13.553445511s] [13.553445511s] END Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.755301 4618 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 09:03:38 crc kubenswrapper[4618]: E0121 09:03:38.755368 4618 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.789330 4618 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60272->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.789374 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60272->192.168.126.11:17697: read: connection reset by peer" Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.789525 4618 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60286->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.789566 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60286->192.168.126.11:17697: read: connection reset by peer" Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.789770 4618 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 21 09:03:38 crc kubenswrapper[4618]: I0121 09:03:38.789788 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.509928 4618 apiserver.go:52] "Watching apiserver" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512216 4618 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512380 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512683 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.512810 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512743 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512729 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512752 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512767 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.512916 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.513382 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.513303 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.514496 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.514707 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.514762 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.515107 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.515200 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.515313 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.515735 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.516689 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.516697 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.519097 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-02 09:45:20.246933481 +0000 UTC Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.534450 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.545800 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.551863 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.558585 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.564953 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.571335 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.576917 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.581900 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.596338 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.597690 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47" exitCode=255 Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.597711 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47"} Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.604696 4618 scope.go:117] "RemoveContainer" containerID="97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47" Jan 21 
09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.604814 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.605585 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.608393 4618 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.613299 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.621074 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.628086 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.634995 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.641239 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659703 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659738 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659767 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659783 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659800 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659814 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659827 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.659841 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660051 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660072 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660087 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660102 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660117 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660133 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 
09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660164 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660179 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660283 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660368 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660450 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660552 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660638 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660657 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660666 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660690 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660734 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660757 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660818 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660844 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660866 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660885 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660903 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660957 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660994 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661090 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661126 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661168 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661188 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661208 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661227 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 09:03:39 crc 
kubenswrapper[4618]: I0121 09:03:39.661263 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661281 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661300 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661319 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661340 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661359 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661399 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661420 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661454 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661473 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661491 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661511 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661529 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661550 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661570 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661594 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661612 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661631 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661648 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661665 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661684 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661720 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 
09:03:39.661736 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661751 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661764 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661780 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661794 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661808 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661824 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661840 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661854 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661868 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661883 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661898 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661914 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661930 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661956 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661970 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661983 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661999 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662013 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662027 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662041 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662056 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662070 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662084 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662098 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662115 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662129 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662171 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662187 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662214 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662228 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662242 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662256 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662270 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662284 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662299 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662313 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662329 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662345 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" 
(UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662362 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662378 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662391 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662406 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662419 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662434 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662450 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662465 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662481 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662495 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662510 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662531 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662545 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662561 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662576 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662592 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 21 
09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662607 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662623 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662639 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662652 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662667 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662682 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") 
pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662697 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662711 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662725 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662741 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662755 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662770 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662784 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662800 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662816 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662831 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662847 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662863 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662879 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662894 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662910 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662926 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662955 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662972 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662988 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663003 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663019 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663035 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663049 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663064 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663079 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663094 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663108 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663123 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663154 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663170 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663186 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663202 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663222 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663240 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663256 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660770 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660933 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.660971 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661009 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661084 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661157 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661310 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661413 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661479 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661493 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661602 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661813 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661844 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.661902 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662317 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662379 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662475 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662507 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662607 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662789 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662897 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.662972 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.663455 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:03:40.163439996 +0000 UTC m=+18.913907313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663271 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663482 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663503 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663522 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663538 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663553 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663567 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663602 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663630 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663675 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663698 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663714 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663756 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663774 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663789 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663803 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663818 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663831 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663836 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663909 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663933 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663965 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663976 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.663982 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664029 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664041 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664059 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664076 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664092 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664116 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664129 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664154 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664191 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664240 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664259 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664277 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664292 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664308 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664312 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664316 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664324 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664341 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664369 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664391 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664408 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664424 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664442 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664459 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664512 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664530 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664545 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664562 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664578 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664595 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664610 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664625 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664660 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664678 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664695 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664713 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664729 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664749 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664766 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664781 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664798 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664818 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664835 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664853 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664870 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664886 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664919 4618
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664929 4618 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664938 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664961 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664973 4618 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664982 4618 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664991 4618 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665000 4618 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665008 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665017 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665025 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665034 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665042 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665050 4618 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665059 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665067 4618 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665076 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665085 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665095 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665103 4618 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665112 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665149 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 
09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665159 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665172 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665181 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665189 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665197 4618 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665206 4618 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665215 4618 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665224 4618 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665232 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665250 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665260 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665269 4618 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665277 4618 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665286 4618 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665296 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665304 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665312 4618 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665320 4618 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.667074 4618 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664359 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664544 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664549 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664689 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.664871 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665198 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665213 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665477 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665498 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665537 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665556 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665913 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665952 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665960 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.665969 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666081 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666125 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666298 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666300 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666311 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666460 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.666521 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.667804 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.667922 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668003 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668082 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668104 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668157 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668195 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668380 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668397 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668536 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668612 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668548 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668721 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.668765 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.668807 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:40.168796508 +0000 UTC m=+18.919263815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668873 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668908 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668934 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668974 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.668999 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.669032 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.669111 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:40.169101164 +0000 UTC m=+18.919568481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669113 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669177 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669343 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669373 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669457 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669559 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669670 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669841 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669881 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669896 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669963 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.669993 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670176 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670433 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670488 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670635 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670712 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670875 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.670987 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671069 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671072 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671130 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671190 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671233 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671426 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671491 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671504 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671609 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671711 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671758 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671814 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671847 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.671937 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672129 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672192 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672239 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672322 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672386 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672437 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672710 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672763 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672851 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672913 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.672867 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.673202 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.673305 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.673316 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.673484 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.674051 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.674073 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.674258 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.674394 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.674575 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.674689 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.675661 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.675675 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.675736 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.676602 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.677124 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.677756 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.677823 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.677839 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678096 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678100 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678415 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678455 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678506 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678620 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678706 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.678913 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.679029 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.679323 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.679550 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.679682 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.679821 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.680331 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681103 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681122 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681175 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681231 4618 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:40.181213345 +0000 UTC m=+18.931680663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.681497 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.681659 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681726 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681747 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681777 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:39 crc kubenswrapper[4618]: E0121 09:03:39.681811 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:40.181799945 +0000 UTC m=+18.932267262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.681875 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.681981 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682093 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682115 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682203 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682371 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682766 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682910 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.682996 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.683243 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.683304 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.683409 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.683490 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.684649 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.688047 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.688358 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.688497 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.688558 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.688724 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.688910 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.689109 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.689240 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.691090 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.691334 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.692290 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.692626 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.692894 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.692883 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.692938 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693106 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693198 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693212 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693389 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693446 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693479 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693562 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693748 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.693750 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.692784 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.698088 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.704481 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.705647 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.706604 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.709581 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766473 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766513 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766577 4618 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766588 4618 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766597 4618 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766606 4618 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766613 4618 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766622 4618 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766630 4618 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766637 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766647 4618 
reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766652 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766657 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766693 4618 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766704 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766713 4618 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766722 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath 
\"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766730 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766738 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766747 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766755 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766763 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766771 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766780 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 
crc kubenswrapper[4618]: I0121 09:03:39.766789 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766796 4618 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766804 4618 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766573 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766812 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766859 4618 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766890 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" 
DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766899 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766908 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766917 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766925 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766932 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766973 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766983 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766991 
4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.766999 4618 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767010 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767018 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767028 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767055 4618 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767062 4618 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767069 4618 reconciler_common.go:293] "Volume detached for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767077 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767084 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767091 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767103 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767122 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767130 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767164 4618 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767172 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767180 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767188 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767195 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767201 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767209 4618 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767215 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767222 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767245 4618 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767253 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767260 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767275 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767282 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767290 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 
09:03:39.767299 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767322 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767330 4618 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767337 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767344 4618 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767358 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.767365 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768248 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768308 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768322 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768333 4618 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768351 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768362 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768372 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768382 4618 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" 
DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768397 4618 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768407 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768431 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768460 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768490 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768503 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768524 4618 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc 
kubenswrapper[4618]: I0121 09:03:39.768538 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768577 4618 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768587 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768597 4618 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768613 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768622 4618 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768632 4618 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768643 4618 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768657 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768668 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768690 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768703 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768712 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768722 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768744 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768757 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768767 4618 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768779 4618 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768788 4618 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768823 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768833 4618 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768842 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on 
node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768869 4618 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768881 4618 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768904 4618 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768912 4618 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768922 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768930 4618 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768937 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 
09:03:39.768968 4618 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768979 4618 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768986 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.768995 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769003 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769013 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769037 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769044 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769055 4618 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769063 4618 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769071 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769078 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769113 4618 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769122 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769130 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on 
node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769137 4618 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769163 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769193 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769201 4618 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769208 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769219 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769227 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769246 4618 
reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769257 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769265 4618 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769272 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769314 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769325 4618 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769333 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769340 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" 
(UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769359 4618 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769370 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769377 4618 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769385 4618 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769408 4618 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769416 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769424 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 21 
09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769431 4618 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769442 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.769449 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.823983 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.828656 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 09:03:39 crc kubenswrapper[4618]: I0121 09:03:39.832977 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 09:03:39 crc kubenswrapper[4618]: W0121 09:03:39.833518 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-366973d6898f630ac0d21ba20dc47294fc6dc20b5a7b8ea608aedcb4b86a0753 WatchSource:0}: Error finding container 366973d6898f630ac0d21ba20dc47294fc6dc20b5a7b8ea608aedcb4b86a0753: Status 404 returned error can't find the container with id 366973d6898f630ac0d21ba20dc47294fc6dc20b5a7b8ea608aedcb4b86a0753 Jan 21 09:03:39 crc kubenswrapper[4618]: W0121 09:03:39.842828 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-578b3a9ded2b8b8b3eaf94076df4399f3ceedaeb5578c3c60d95e23c0590d167 WatchSource:0}: Error finding container 578b3a9ded2b8b8b3eaf94076df4399f3ceedaeb5578c3c60d95e23c0590d167: Status 404 returned error can't find the container with id 578b3a9ded2b8b8b3eaf94076df4399f3ceedaeb5578c3c60d95e23c0590d167 Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.173327 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.173422 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.173456 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:03:41.173435344 +0000 UTC m=+19.923902662 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.173512 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.173529 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.173576 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.173594 4618 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:41.173582543 +0000 UTC m=+19.924049861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.173623 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:41.173600687 +0000 UTC m=+19.924068005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.274128 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.274211 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274295 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274308 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274318 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274315 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274360 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274373 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274342 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:41.274335243 +0000 UTC m=+20.024802560 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:40 crc kubenswrapper[4618]: E0121 09:03:40.274443 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-21 09:03:41.274430514 +0000 UTC m=+20.024897830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.519788 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:52:51.333688698 +0000 UTC Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.600405 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"578b3a9ded2b8b8b3eaf94076df4399f3ceedaeb5578c3c60d95e23c0590d167"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.601560 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.601584 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.601596 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dede31c254fa3b06a38c4cacdf5f67340e79786cd66691854a6f01d4476d7429"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.602886 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.602910 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"366973d6898f630ac0d21ba20dc47294fc6dc20b5a7b8ea608aedcb4b86a0753"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.604388 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.605593 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92"} Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.605810 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.623796 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.631104 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.638347 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.645072 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.652717 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 
genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.659838 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.667427 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.676762 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.684104 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.691521 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.698483 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.706948 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.715499 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:40 crc kubenswrapper[4618]: I0121 09:03:40.722896 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:40Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.181371 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:41 crc 
kubenswrapper[4618]: E0121 09:03:41.181490 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:03:43.181460768 +0000 UTC m=+21.931928086 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.181525 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.181548 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.181624 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 
09:03:41.181677 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:43.181670035 +0000 UTC m=+21.932137352 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.181688 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.181753 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:43.181737332 +0000 UTC m=+21.932204659 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.282713 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.282755 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282833 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282846 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282855 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282854 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282873 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282890 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282906 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:43.282896421 +0000 UTC m=+22.033363738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.282923 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-21 09:03:43.282912911 +0000 UTC m=+22.033380238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.520128 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 21:42:04.317488255 +0000 UTC Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.537627 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.537650 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.537712 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.537751 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.537840 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.537888 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.540324 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.540880 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.541839 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.542400 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.543241 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.543685 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.544189 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.544975 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.545499 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.546135 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] 
Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.546279 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.546716 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.547625 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.548040 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.548484 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.549250 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.549679 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.550464 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.550782 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.551270 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.552059 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.552491 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.553301 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.553667 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.553769 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.554534 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.554904 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.555432 4618 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.556313 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.556706 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.557573 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.557954 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.558700 4618 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.558790 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.560205 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 21 09:03:41 
crc kubenswrapper[4618]: I0121 09:03:41.560917 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.561343 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.561598 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.562583 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" 
path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.563153 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.563898 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.564453 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.565338 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.565714 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.566533 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.567074 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.567950 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.568365 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.569108 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.569511 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.569590 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.570489 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.570898 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.571653 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.572301 4618 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.573064 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.573716 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.574124 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.577787 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.585598 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.594169 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.607408 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d"} Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.620473 4618 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.628038 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.635517 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.642563 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.651244 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.659300 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.667013 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.955540 4618 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.956694 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.956722 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.956731 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.956761 4618 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.960919 4618 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.961110 4618 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.963816 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.963845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.963855 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.963866 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.963874 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:41Z","lastTransitionTime":"2026-01-21T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.975601 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.977586 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.977606 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.977615 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.977625 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.977633 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:41Z","lastTransitionTime":"2026-01-21T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:41 crc kubenswrapper[4618]: E0121 09:03:41.985032 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.986983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.987009 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.987017 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.987026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.987033 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:41Z","lastTransitionTime":"2026-01-21T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.997422 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.997450 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.997460 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.997471 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:41 crc kubenswrapper[4618]: I0121 09:03:41.997478 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:41Z","lastTransitionTime":"2026-01-21T09:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.008171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.008201 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.008210 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.008222 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.008230 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: E0121 09:03:42.015775 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: E0121 09:03:42.015902 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.016980 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.017007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.017016 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.017027 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.017035 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.118728 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.118760 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.118769 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.118781 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.118790 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.220352 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.220380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.220390 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.220401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.220409 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.322245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.322297 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.322312 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.322326 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.322334 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.423883 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.423918 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.423928 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.423941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.423950 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.443472 4618 csr.go:261] certificate signing request csr-p6ngg is approved, waiting to be issued Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.466705 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.471229 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.479574 4618 csr.go:257] certificate signing request csr-p6ngg is issued Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.480031 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.481662 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.494792 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.505746 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.514811 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.521217 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:06:01.33134257 +0000 UTC Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.523120 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.525433 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.525462 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.525471 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.525483 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.525490 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.532989 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.542044 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.568133 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.581757 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.590038 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.597965 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.605532 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.619212 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.626811 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.626857 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.626868 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.626880 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.626888 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.627762 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.638912 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.646799 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.654728 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.664539 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.672185 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.680845 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.682830 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.690520 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.699378 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.707526 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.715587 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.723059 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.728226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.728270 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.728279 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.728291 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.728298 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.829850 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.829886 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.829894 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.829908 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.829917 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.931594 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.931630 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.931639 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.931651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.931660 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:42Z","lastTransitionTime":"2026-01-21T09:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.953994 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-drdgl"] Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.954284 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.955774 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.955938 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.955945 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.972235 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.980824 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:42 crc kubenswrapper[4618]: I0121 09:03:42.994629 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:42Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.007711 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.017067 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.026626 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.033133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.033178 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.033189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.033201 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.033211 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.034433 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.043551 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] 
Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.054224 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.065254 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.095619 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2972a05a-04de-4c13-8436-38cbbac3a4a3-hosts-file\") pod \"node-resolver-drdgl\" (UID: \"2972a05a-04de-4c13-8436-38cbbac3a4a3\") " pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.095682 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5d9\" (UniqueName: \"kubernetes.io/projected/2972a05a-04de-4c13-8436-38cbbac3a4a3-kube-api-access-2h5d9\") pod \"node-resolver-drdgl\" (UID: \"2972a05a-04de-4c13-8436-38cbbac3a4a3\") " pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.135104 4618 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.135136 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.135160 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.135173 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.135182 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.195968 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.196071 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.196094 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5d9\" (UniqueName: \"kubernetes.io/projected/2972a05a-04de-4c13-8436-38cbbac3a4a3-kube-api-access-2h5d9\") pod \"node-resolver-drdgl\" (UID: \"2972a05a-04de-4c13-8436-38cbbac3a4a3\") " pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.196209 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.196422 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:03:47.196118762 +0000 UTC m=+25.946586080 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.196464 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:47.196442205 +0000 UTC m=+25.946909522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.196537 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2972a05a-04de-4c13-8436-38cbbac3a4a3-hosts-file\") pod \"node-resolver-drdgl\" (UID: \"2972a05a-04de-4c13-8436-38cbbac3a4a3\") " pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.196619 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2972a05a-04de-4c13-8436-38cbbac3a4a3-hosts-file\") pod \"node-resolver-drdgl\" (UID: \"2972a05a-04de-4c13-8436-38cbbac3a4a3\") " pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.196675 4618 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.196763 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.196800 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:47.196790062 +0000 UTC m=+25.947257379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.209394 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5d9\" (UniqueName: \"kubernetes.io/projected/2972a05a-04de-4c13-8436-38cbbac3a4a3-kube-api-access-2h5d9\") pod \"node-resolver-drdgl\" (UID: \"2972a05a-04de-4c13-8436-38cbbac3a4a3\") " pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.237520 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.237582 4618 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.237591 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.237603 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.237612 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.263797 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-drdgl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.298008 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.298059 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298167 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298182 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298212 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298187 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298224 4618 projected.go:194] Error preparing data for projected 
volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298232 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298273 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:47.298259228 +0000 UTC m=+26.048726545 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.298289 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:47.29828185 +0000 UTC m=+26.048749168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.330937 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-m6jz5"] Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.331181 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2bm47"] Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.331331 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.331391 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-24dd7"] Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.331494 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.332478 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.333061 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.333973 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334232 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334389 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334502 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334639 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334760 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334796 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334919 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334974 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.334768 4618 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.335084 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.339956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.339977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.339993 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.340004 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.340013 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.344198 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.353297 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.363496 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.377051 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.386319 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:
22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.394957 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.403434 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.413387 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.422130 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.431750 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.440305 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.441321 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.441350 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.441359 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 
09:03:43.441372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.441381 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.452830 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.461257 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.469030 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.477857 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.481093 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-21 08:58:42 +0000 UTC, rotation deadline is 2026-10-23 16:38:38.759706343 +0000 UTC Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.481155 4618 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6607h34m55.278554122s for next certificate rotation Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.485606 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.494406 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499512 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbtt5\" (UniqueName: \"kubernetes.io/projected/32082919-a07c-414d-b784-1ad042460385-kube-api-access-lbtt5\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499548 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-kubelet\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499566 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-tuning-conf-dir\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " 
pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499580 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-os-release\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499594 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-socket-dir-parent\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499678 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32082919-a07c-414d-b784-1ad042460385-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499710 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f819fb41-8eb7-4f8f-85f9-752aa5716cca-mcd-auth-proxy-config\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499728 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f819fb41-8eb7-4f8f-85f9-752aa5716cca-rootfs\") pod 
\"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499744 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32082919-a07c-414d-b784-1ad042460385-cni-binary-copy\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499795 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-k8s-cni-cncf-io\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499815 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-netns\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499843 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-cni-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499859 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-cnibin\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499874 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/052a66c4-94ce-4336-93f6-1d0023e58cc4-cni-binary-copy\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499896 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-cni-bin\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499914 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-cni-multus\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499928 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dps94\" (UniqueName: \"kubernetes.io/projected/052a66c4-94ce-4336-93f6-1d0023e58cc4-kube-api-access-dps94\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.499975 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-etc-kubernetes\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500011 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-multus-certs\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500054 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-system-cni-dir\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500087 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f819fb41-8eb7-4f8f-85f9-752aa5716cca-proxy-tls\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500128 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-system-cni-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500162 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-daemon-config\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500180 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-conf-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500197 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-hostroot\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500782 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sddf\" (UniqueName: \"kubernetes.io/projected/f819fb41-8eb7-4f8f-85f9-752aa5716cca-kube-api-access-8sddf\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.500838 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-cnibin\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.501063 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-os-release\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.506593 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont
/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.514386 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.522095 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 01:39:11.025792148 +0000 UTC Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.522191 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.528889 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.535866 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.537015 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.537081 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.537185 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.537212 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.537271 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.537331 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.542896 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.542921 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.542931 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.542941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.542949 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.544519 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.553797 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602273 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-tuning-conf-dir\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602300 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-os-release\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602318 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-socket-dir-parent\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602334 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/32082919-a07c-414d-b784-1ad042460385-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602350 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f819fb41-8eb7-4f8f-85f9-752aa5716cca-mcd-auth-proxy-config\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602365 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f819fb41-8eb7-4f8f-85f9-752aa5716cca-rootfs\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602380 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32082919-a07c-414d-b784-1ad042460385-cni-binary-copy\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602393 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-k8s-cni-cncf-io\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602406 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-netns\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602406 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-os-release\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602430 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-cni-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602443 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-cnibin\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602456 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/052a66c4-94ce-4336-93f6-1d0023e58cc4-cni-binary-copy\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602471 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-cni-bin\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " 
pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602486 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-cni-multus\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602499 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dps94\" (UniqueName: \"kubernetes.io/projected/052a66c4-94ce-4336-93f6-1d0023e58cc4-kube-api-access-dps94\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602519 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-etc-kubernetes\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602532 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-multus-certs\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602545 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-system-cni-dir\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 
09:03:43.602559 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f819fb41-8eb7-4f8f-85f9-752aa5716cca-proxy-tls\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602583 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-system-cni-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602596 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-daemon-config\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602628 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-conf-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602641 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-hostroot\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602654 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sddf\" (UniqueName: 
\"kubernetes.io/projected/f819fb41-8eb7-4f8f-85f9-752aa5716cca-kube-api-access-8sddf\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602668 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-cnibin\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602681 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-os-release\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602694 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbtt5\" (UniqueName: \"kubernetes.io/projected/32082919-a07c-414d-b784-1ad042460385-kube-api-access-lbtt5\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602707 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-kubelet\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602773 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-kubelet\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602801 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f819fb41-8eb7-4f8f-85f9-752aa5716cca-rootfs\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602801 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-tuning-conf-dir\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602832 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-system-cni-dir\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602848 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-cnibin\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602871 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-multus-certs\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602914 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-hostroot\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602953 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-socket-dir-parent\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602976 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-system-cni-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.602997 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f819fb41-8eb7-4f8f-85f9-752aa5716cca-mcd-auth-proxy-config\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603023 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/32082919-a07c-414d-b784-1ad042460385-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-24dd7\" (UID: 
\"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603033 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-cni-multus\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603030 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-conf-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603049 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-var-lib-cni-bin\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603047 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-etc-kubernetes\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603122 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-cni-dir\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603164 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-k8s-cni-cncf-io\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603169 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/052a66c4-94ce-4336-93f6-1d0023e58cc4-host-run-netns\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603182 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-os-release\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603267 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32082919-a07c-414d-b784-1ad042460385-cnibin\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603370 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32082919-a07c-414d-b784-1ad042460385-cni-binary-copy\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603529 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/052a66c4-94ce-4336-93f6-1d0023e58cc4-cni-binary-copy\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.603579 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/052a66c4-94ce-4336-93f6-1d0023e58cc4-multus-daemon-config\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.606090 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f819fb41-8eb7-4f8f-85f9-752aa5716cca-proxy-tls\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.613238 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-drdgl" event={"ID":"2972a05a-04de-4c13-8436-38cbbac3a4a3","Type":"ContainerStarted","Data":"758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.613270 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-drdgl" event={"ID":"2972a05a-04de-4c13-8436-38cbbac3a4a3","Type":"ContainerStarted","Data":"2bd49321e4529cf7a5cb562a83efa7f8736c49aa0d6ba64cdc4dfcceb9c3de2e"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.615185 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sddf\" (UniqueName: \"kubernetes.io/projected/f819fb41-8eb7-4f8f-85f9-752aa5716cca-kube-api-access-8sddf\") pod \"machine-config-daemon-2bm47\" (UID: \"f819fb41-8eb7-4f8f-85f9-752aa5716cca\") " 
pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.615623 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbtt5\" (UniqueName: \"kubernetes.io/projected/32082919-a07c-414d-b784-1ad042460385-kube-api-access-lbtt5\") pod \"multus-additional-cni-plugins-24dd7\" (UID: \"32082919-a07c-414d-b784-1ad042460385\") " pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.616129 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dps94\" (UniqueName: \"kubernetes.io/projected/052a66c4-94ce-4336-93f6-1d0023e58cc4-kube-api-access-dps94\") pod \"multus-m6jz5\" (UID: \"052a66c4-94ce-4336-93f6-1d0023e58cc4\") " pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: E0121 09:03:43.617929 4618 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.621975 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.629480 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.637079 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.642221 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.644027 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.644395 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.644420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.644429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.644442 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.644449 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.647566 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m6jz5" Jan 21 09:03:43 crc kubenswrapper[4618]: W0121 09:03:43.650493 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf819fb41_8eb7_4f8f_85f9_752aa5716cca.slice/crio-a596bf86b475d75500e16d0983e41be10e71a96905201d8634faa36b9ac716d1 WatchSource:0}: Error finding container a596bf86b475d75500e16d0983e41be10e71a96905201d8634faa36b9ac716d1: Status 404 returned error can't find the container with id a596bf86b475d75500e16d0983e41be10e71a96905201d8634faa36b9ac716d1 Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.653466 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.656663 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-24dd7" Jan 21 09:03:43 crc kubenswrapper[4618]: W0121 09:03:43.656943 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052a66c4_94ce_4336_93f6_1d0023e58cc4.slice/crio-8d93e98d6e9a6b37da6eefe9b622991ba7f6d0830090d6a3638a770a14f89327 WatchSource:0}: Error finding container 8d93e98d6e9a6b37da6eefe9b622991ba7f6d0830090d6a3638a770a14f89327: Status 404 returned error can't find the container with id 8d93e98d6e9a6b37da6eefe9b622991ba7f6d0830090d6a3638a770a14f89327 Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.664182 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.676497 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.691363 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.698963 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-894tg"] Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.701443 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.703634 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.703641 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.705809 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.706111 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.706259 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.706356 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.706383 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.707499 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.733103 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.746194 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.746216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.746225 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.746237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.746245 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.775199 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.803869 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-netd\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.803903 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/992361e5-8eb9-426d-9eed-afffb0c30615-ovn-node-metrics-cert\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.803922 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.803952 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-systemd\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804012 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-etc-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804055 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-node-log\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804089 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-ovn\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804110 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-log-socket\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804127 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-ovn-kubernetes\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804240 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-slash\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804278 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58lk\" (UniqueName: \"kubernetes.io/projected/992361e5-8eb9-426d-9eed-afffb0c30615-kube-api-access-c58lk\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804309 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804326 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-systemd-units\") pod \"ovnkube-node-894tg\" (UID: 
\"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804348 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-script-lib\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804379 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-var-lib-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804395 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-bin\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804411 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-config\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804427 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-env-overrides\") pod 
\"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804441 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-kubelet\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.804483 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-netns\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.815564 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.848262 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.848288 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.848296 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 
09:03:43.848308 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.848317 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.855453 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.894707 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.904966 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-slash\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905001 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58lk\" (UniqueName: \"kubernetes.io/projected/992361e5-8eb9-426d-9eed-afffb0c30615-kube-api-access-c58lk\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905030 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905047 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-systemd-units\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905060 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-script-lib\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 
09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905061 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-slash\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905119 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-systemd-units\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905081 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-config\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905175 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905193 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-var-lib-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905228 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-var-lib-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905252 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-bin\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905273 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-env-overrides\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905289 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-kubelet\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905323 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-netns\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905344 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-netd\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905358 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/992361e5-8eb9-426d-9eed-afffb0c30615-ovn-node-metrics-cert\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905376 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-netd\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905380 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905390 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-netns\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905357 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-kubelet\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905412 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-systemd\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905386 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-bin\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905403 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905426 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-etc-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905442 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-node-log\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905456 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-ovn\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-log-socket\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905483 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-ovn-kubernetes\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905521 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-ovn-kubernetes\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905525 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-ovn\") pod 
\"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905519 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-node-log\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905543 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-log-socket\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905538 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-etc-openvswitch\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905794 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-env-overrides\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905800 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-config\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 
crc kubenswrapper[4618]: I0121 09:03:43.905837 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-systemd\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.905971 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-script-lib\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.907692 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/992361e5-8eb9-426d-9eed-afffb0c30615-ovn-node-metrics-cert\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.939025 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58lk\" (UniqueName: \"kubernetes.io/projected/992361e5-8eb9-426d-9eed-afffb0c30615-kube-api-access-c58lk\") pod \"ovnkube-node-894tg\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.949893 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.949922 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.949931 4618 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.949952 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.949962 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:43Z","lastTransitionTime":"2026-01-21T09:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.954782 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:43 crc kubenswrapper[4618]: I0121 09:03:43.998095 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.010908 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:44 crc kubenswrapper[4618]: W0121 09:03:44.018884 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992361e5_8eb9_426d_9eed_afffb0c30615.slice/crio-db7e2558f63516ef0c2a647348cc48b93e4a889b12023e6ba37db3dcad500151 WatchSource:0}: Error finding container db7e2558f63516ef0c2a647348cc48b93e4a889b12023e6ba37db3dcad500151: Status 404 returned error can't find the container with id db7e2558f63516ef0c2a647348cc48b93e4a889b12023e6ba37db3dcad500151 Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.034734 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.051904 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.051930 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.051945 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.051957 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.051965 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.073306 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.114288 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.153665 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.153691 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.153700 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.153710 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.153719 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.154315 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.195467 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.233721 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.255634 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.255659 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.255668 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.255680 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.255688 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.273443 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.313384 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.352685 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.357801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.357831 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.357840 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.357853 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.357862 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.393244 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.438718 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.459599 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.459743 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.459751 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.459763 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.459771 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.522211 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 16:55:49.226735594 +0000 UTC Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.560937 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.560962 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.560972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.560981 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.560989 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.617110 4618 generic.go:334] "Generic (PLEG): container finished" podID="32082919-a07c-414d-b784-1ad042460385" containerID="2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10" exitCode=0 Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.617169 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerDied","Data":"2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.617209 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerStarted","Data":"b862d7c8a24c73ea5fd7ce6dfd1f74cf8a97961bf4e0ffa7230522fa2adb8bc5"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.618783 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" exitCode=0 Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.618880 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.618914 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"db7e2558f63516ef0c2a647348cc48b93e4a889b12023e6ba37db3dcad500151"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.619824 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" 
event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerStarted","Data":"79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.619855 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerStarted","Data":"8d93e98d6e9a6b37da6eefe9b622991ba7f6d0830090d6a3638a770a14f89327"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.622463 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.622497 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.622509 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"a596bf86b475d75500e16d0983e41be10e71a96905201d8634faa36b9ac716d1"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.628424 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.640971 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.654367 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.662741 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.662774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.662782 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.662793 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.662801 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.664114 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.672392 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.680600 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.714648 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.755958 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.764909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.764939 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.764949 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.764961 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.764970 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.793400 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.832980 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.866299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.866328 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.866336 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.866348 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.866356 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.878765 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.914950 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.954518 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.967564 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.967592 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.967600 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.967611 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.967619 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:44Z","lastTransitionTime":"2026-01-21T09:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:44 crc kubenswrapper[4618]: I0121 09:03:44.993125 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:44Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.034641 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.069544 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.069573 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.069583 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.069595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.069605 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.074696 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.115128 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.152044 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.171998 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.172038 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.172047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.172059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.172069 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.193330 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.246541 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.274272 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.274308 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.274318 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.274331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.274339 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.276865 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.314744 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.358887 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.376189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.376221 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.376231 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.376244 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.376252 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.393842 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.435670 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.475228 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.478339 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.478366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.478375 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.478387 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.478396 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.513733 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.523025 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 03:32:43.364713374 +0000 UTC Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.537751 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.537792 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:45 crc kubenswrapper[4618]: E0121 09:03:45.537844 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.537801 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:45 crc kubenswrapper[4618]: E0121 09:03:45.537973 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:45 crc kubenswrapper[4618]: E0121 09:03:45.537897 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.554281 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.580490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.580517 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.580526 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.580535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.580543 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.627034 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.627061 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.627073 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.627081 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.627088 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.627095 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} Jan 21 09:03:45 crc kubenswrapper[4618]: 
I0121 09:03:45.628173 4618 generic.go:334] "Generic (PLEG): container finished" podID="32082919-a07c-414d-b784-1ad042460385" containerID="53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01" exitCode=0 Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.628202 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerDied","Data":"53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.638901 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.648883 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.674910 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.681947 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc 
kubenswrapper[4618]: I0121 09:03:45.681977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.681986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.681997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.682005 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.714839 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.754317 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.783362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.783410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.783420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc 
kubenswrapper[4618]: I0121 09:03:45.783432 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.783441 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.794547 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.833298 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.873788 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.884867 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.884892 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.884899 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.884912 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.884919 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.920396 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.953794 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 
09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.987083 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.987111 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.987120 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.987134 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.987171 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:45Z","lastTransitionTime":"2026-01-21T09:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:45 crc kubenswrapper[4618]: I0121 09:03:45.994819 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:45Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.038103 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.074415 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.088765 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.088793 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.088801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.088812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.088820 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.113755 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.190034 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.190059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.190068 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.190080 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.190089 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.292217 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.292262 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.292271 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.292283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.292294 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.393807 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.393845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.393856 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.393870 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.393880 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.495695 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.495724 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.495733 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.495746 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.495754 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.523203 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 14:22:37.342532105 +0000 UTC Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.543365 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fbzbk"] Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.543640 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.545495 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.545747 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.545818 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.546365 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.555177 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.564228 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.576729 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.584850 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.592267 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.597893 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.597923 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.597931 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.597942 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.597950 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.599095 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.608052 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] 
Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.615902 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.624714 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.625902 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a83f21b2-3e45-4972-a51f-75b5a51495dd-host\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.625956 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zn4\" (UniqueName: \"kubernetes.io/projected/a83f21b2-3e45-4972-a51f-75b5a51495dd-kube-api-access-l7zn4\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.626069 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a83f21b2-3e45-4972-a51f-75b5a51495dd-serviceca\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.631400 4618 generic.go:334] "Generic (PLEG): container finished" podID="32082919-a07c-414d-b784-1ad042460385" containerID="c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae" exitCode=0 Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.631432 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerDied","Data":"c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.634274 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.643675 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.673455 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.699951 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.699981 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.699990 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.700003 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.700013 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.714205 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.726492 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zn4\" (UniqueName: \"kubernetes.io/projected/a83f21b2-3e45-4972-a51f-75b5a51495dd-kube-api-access-l7zn4\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.726538 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a83f21b2-3e45-4972-a51f-75b5a51495dd-serviceca\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.726558 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a83f21b2-3e45-4972-a51f-75b5a51495dd-host\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.727403 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a83f21b2-3e45-4972-a51f-75b5a51495dd-host\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.728391 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a83f21b2-3e45-4972-a51f-75b5a51495dd-serviceca\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.760997 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zn4\" (UniqueName: \"kubernetes.io/projected/a83f21b2-3e45-4972-a51f-75b5a51495dd-kube-api-access-l7zn4\") pod \"node-ca-fbzbk\" (UID: \"a83f21b2-3e45-4972-a51f-75b5a51495dd\") " pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.773563 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.802244 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.802264 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.802272 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc 
kubenswrapper[4618]: I0121 09:03:46.802283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.802292 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.817544 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.852762 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fbzbk" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.854799 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: W0121 09:03:46.878305 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda83f21b2_3e45_4972_a51f_75b5a51495dd.slice/crio-d47d7c8aaf5dbcd71f3a9dc366f469c1b185f6a080068e278ff252b1646ceb4f WatchSource:0}: Error finding container d47d7c8aaf5dbcd71f3a9dc366f469c1b185f6a080068e278ff252b1646ceb4f: Status 404 returned error can't find the container with id d47d7c8aaf5dbcd71f3a9dc366f469c1b185f6a080068e278ff252b1646ceb4f Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.894074 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.904475 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.904498 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.904506 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.904517 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.904526 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:46Z","lastTransitionTime":"2026-01-21T09:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.933621 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:46 crc kubenswrapper[4618]: I0121 09:03:46.973329 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:46Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.006099 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.006126 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.006135 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.006162 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 
09:03:47.006171 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.013690 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.053336 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.093871 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.107577 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.107602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.107610 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.107619 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.107627 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.132930 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.178655 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.209084 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.209107 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.209116 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.209126 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.209135 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.215403 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.231201 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.231259 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.231289 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.231382 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.231401 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:03:55.231377343 +0000 UTC m=+33.981844659 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.231435 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:55.231422148 +0000 UTC m=+33.981889464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.231464 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.231526 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:55.23151915 +0000 UTC m=+33.981986467 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.254787 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 
09:03:47.297960 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.310552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.310580 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.310589 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.310600 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.310609 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.332073 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.332114 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332217 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332235 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332246 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332247 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:47 crc 
kubenswrapper[4618]: E0121 09:03:47.332268 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332280 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332282 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:55.332272982 +0000 UTC m=+34.082740300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.332332 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:55.332317056 +0000 UTC m=+34.082784383 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.334451 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.373739 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.412320 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.412345 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.412353 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.412362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.412370 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.413912 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"message\\\":\\\"containers with unready 
status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.514018 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.514057 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.514065 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 
09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.514078 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.514086 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.523619 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 09:27:22.974828237 +0000 UTC Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.537098 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.537165 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.537188 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.537247 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.537295 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:47 crc kubenswrapper[4618]: E0121 09:03:47.537416 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.615879 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.615906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.615914 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.615924 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.615932 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.636509 4618 generic.go:334] "Generic (PLEG): container finished" podID="32082919-a07c-414d-b784-1ad042460385" containerID="f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c" exitCode=0 Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.636570 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerDied","Data":"f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.638366 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fbzbk" event={"ID":"a83f21b2-3e45-4972-a51f-75b5a51495dd","Type":"ContainerStarted","Data":"76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.638391 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fbzbk" event={"ID":"a83f21b2-3e45-4972-a51f-75b5a51495dd","Type":"ContainerStarted","Data":"d47d7c8aaf5dbcd71f3a9dc366f469c1b185f6a080068e278ff252b1646ceb4f"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.642016 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.646922 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.658911 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.671743 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.680549 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.689220 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.696006 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.709241 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] 
Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.718059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.718084 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.718092 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.718105 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 
09:03:47.718113 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.734328 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.774368 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.814527 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.819827 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.819858 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.819868 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.819879 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.819887 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.854737 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.893679 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.937512 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.937534 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.937543 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.937555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.937545 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e6300
1aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.937564 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:47Z","lastTransitionTime":"2026-01-21T09:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:47 crc kubenswrapper[4618]: I0121 09:03:47.972677 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:47Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.016487 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.038999 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.039023 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.039031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.039050 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.039058 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.053293 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.094416 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.137852 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.140443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.140468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.140477 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.140491 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.140500 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.174842 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.214134 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.241702 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.241726 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.241734 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.241745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.241754 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.251514 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8
d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.294281 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] 
Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.332853 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.342981 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.343007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 
09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.343015 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.343026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.343035 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.373600 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.414398 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.444331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.444359 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.444367 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.444378 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.444387 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.454009 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.493135 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.523777 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:54:03.336164322 +0000 UTC Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.532224 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.545899 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.545935 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.545944 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.545956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.545965 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.573385 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.617281 4618 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.645794 4618 generic.go:334] "Generic (PLEG): container finished" podID="32082919-a07c-414d-b784-1ad042460385" containerID="b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f" exitCode=0 Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.645831 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerDied","Data":"b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.646998 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.647022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.647030 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.647041 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: 
I0121 09:03:48.647060 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.653479 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.693681 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.738074 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.752404 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.752433 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.752442 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.752455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.752463 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.773956 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7f
fd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.813496 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.853493 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.854545 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.854568 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.854577 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.854588 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.854596 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.895040 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.935125 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.956115 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.956178 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.956190 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.956202 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.956212 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:48Z","lastTransitionTime":"2026-01-21T09:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:48 crc kubenswrapper[4618]: I0121 09:03:48.972760 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:48Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.018169 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c4
99ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.054478 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.057566 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.057596 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.057607 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.057621 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.057630 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.093755 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.134162 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.158990 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.159018 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc 
kubenswrapper[4618]: I0121 09:03:49.159027 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.159040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.159059 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.174563 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.214016 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 
requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.260570 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.260734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.260742 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.260755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc 
kubenswrapper[4618]: I0121 09:03:49.260763 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.363403 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.363436 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.363447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.363465 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.363476 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.465299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.465331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.465340 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.465352 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.465363 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.523872 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 04:30:35.304074417 +0000 UTC Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.537204 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.537216 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.537260 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:49 crc kubenswrapper[4618]: E0121 09:03:49.537356 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:49 crc kubenswrapper[4618]: E0121 09:03:49.537537 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:49 crc kubenswrapper[4618]: E0121 09:03:49.537657 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.567207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.567237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.567245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.567258 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.567267 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.649835 4618 generic.go:334] "Generic (PLEG): container finished" podID="32082919-a07c-414d-b784-1ad042460385" containerID="9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9" exitCode=0 Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.649901 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerDied","Data":"9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.654037 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.654307 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.659664 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.671244 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.671266 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.671274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.671286 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.671295 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.672441 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.673586 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.687453 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.696633 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.704278 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.711069 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.719547 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.728866 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.736564 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.743627 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.755405 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.763615 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.771887 4618 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.772909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.772937 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.772946 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.772959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.772967 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.779014 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.812309 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.853704 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.874313 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.874347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.874355 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.874368 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.874378 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.892566 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.933419 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.975760 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.975793 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.975801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.975813 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.975823 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:49Z","lastTransitionTime":"2026-01-21T09:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:49 crc kubenswrapper[4618]: I0121 09:03:49.984957 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.015079 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.054223 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.077444 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.077472 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.077480 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.077493 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.077501 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.095465 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.135174 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.174764 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.179023 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 
09:03:50.179055 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.179077 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.181313 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.181660 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.214081 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.258712 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09
:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.283454 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.283489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.283500 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.283513 4618 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.283524 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.295100 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.334062 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.374177 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.385670 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc 
kubenswrapper[4618]: I0121 09:03:50.385701 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.385711 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.385723 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.385733 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.414876 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 
09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.487639 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.487676 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.487683 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.487695 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.487704 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.523977 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 23:33:27.59187462 +0000 UTC Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.589941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.589967 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.589975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.589985 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.589994 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.658436 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" event={"ID":"32082919-a07c-414d-b784-1ad042460385","Type":"ContainerStarted","Data":"8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.658502 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.658720 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.671092 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.673925 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.686797 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.691366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.691393 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.691401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 
09:03:50.691415 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.691424 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.697890 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.707391 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.717083 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.725379 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.731895 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.739330 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.777419 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.793252 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.793274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.793282 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.793292 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.793302 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.813981 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.854815 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.895064 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.895117 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.895125 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.895137 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.895159 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.899350 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.933907 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.973883 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:50Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.996972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.997018 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.997026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.997038 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:50 crc kubenswrapper[4618]: I0121 09:03:50.997046 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:50Z","lastTransitionTime":"2026-01-21T09:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.013161 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.054933 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.097507 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.098349 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.098375 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.098384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.098397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.098405 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.134729 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7f
fd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.174595 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.200010 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.200037 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.200045 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.200068 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.200085 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.213549 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.253506 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.294449 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.301760 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.301787 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.301795 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.301808 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.301816 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.335412 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.378850 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.403769 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.403798 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.403807 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.403820 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.403829 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.413871 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T
09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.440403 4618 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.453514 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.492825 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.505359 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.505389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.505397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.505409 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.505417 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.525006 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:48:33.539888225 +0000 UTC Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.534171 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01
-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.537402 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.537411 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.537470 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:51 crc kubenswrapper[4618]: E0121 09:03:51.537579 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:51 crc kubenswrapper[4618]: E0121 09:03:51.537644 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:51 crc kubenswrapper[4618]: E0121 09:03:51.537737 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.574362 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.607046 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.607072 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.607090 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.607100 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.607108 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.614638 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.655280 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.657607 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.661605 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/0.log" Jan 21 
09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.663634 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7" exitCode=1 Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.663676 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.664237 4618 scope.go:117] "RemoveContainer" containerID="4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.692262 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fb
ad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.709171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.709200 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.709210 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.709221 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.709230 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.738022 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.775530 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.810401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.810434 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.810443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.810455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.810463 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.815473 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.855369 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.894787 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 
requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.912458 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.912496 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.912505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.912519 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:51 crc 
kubenswrapper[4618]: I0121 09:03:51.912530 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:51Z","lastTransitionTime":"2026-01-21T09:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.938335 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:51 crc kubenswrapper[4618]: I0121 09:03:51.973386 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f61
8e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.013794 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.014725 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.014754 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.014764 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc 
kubenswrapper[4618]: I0121 09:03:52.014777 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.014785 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.056967 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.094008 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.116347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.116383 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.116393 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.116408 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.116417 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.134390 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.178275 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.180171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.180199 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.180208 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.180221 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.180230 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.189564 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.191798 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.191830 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.191840 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.191851 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.191861 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.199931 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.202025 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.202047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.202056 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.202069 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.202077 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.209650 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.211876 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.211900 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.211909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.211919 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.211927 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.215609 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.220835 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.222957 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.222991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.223000 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.223012 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.223022 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.230501 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.230603 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.231477 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.231502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.231511 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.231525 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.231532 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.254213 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7f
fd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.294415 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.333426 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.333461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.333476 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.333490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.333500 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.334473 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.372745 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.413812 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.436128 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.436183 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.436192 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.436204 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.436212 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.457575 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"message\\\":\\\"ient-go/informers/factory.go:160\\\\nI0121 09:03:51.110351 5941 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0121 09:03:51.110359 5941 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0121 09:03:51.110380 5941 factory.go:656] Stopping watch factory\\\\nI0121 09:03:51.110380 5941 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 09:03:51.110390 5941 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 09:03:51.110520 5941 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 09:03:51.110630 5941 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 09:03:51.110702 5941 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0121 09:03:51.110738 5941 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.493831 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.525962 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 04:11:15.142097107 +0000 UTC Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.535288 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.538321 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.538347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.538355 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.538367 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.538375 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.579505 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.614362 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.640460 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.640490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.640499 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.640513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.640522 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.654972 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.667450 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/1.log" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.667893 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/0.log" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.670014 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d" exitCode=1 Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.670050 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" 
event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.670101 4618 scope.go:117] "RemoveContainer" containerID="4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.670527 4618 scope.go:117] "RemoveContainer" containerID="9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d" Jan 21 09:03:52 crc kubenswrapper[4618]: E0121 09:03:52.670658 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.692703 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.736610 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.742648 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.742675 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.742684 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.742711 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.742724 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.774368 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.814809 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.844274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc 
kubenswrapper[4618]: I0121 09:03:52.844308 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.844318 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.844330 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.844339 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.857914 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.893357 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.935078 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.946493 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.946518 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.946526 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.946537 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.946545 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:52Z","lastTransitionTime":"2026-01-21T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:52 crc kubenswrapper[4618]: I0121 09:03:52.972831 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8
d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.015623 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\"
,\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.048551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.048580 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.048588 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.048602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.048610 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.054580 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.094539 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.136596 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b5c4a24de92ae3be6f0cbcae418e92eff811ac42b645164c990f59f0d1c26f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"message\\\":\\\"ient-go/informers/factory.go:160\\\\nI0121 09:03:51.110351 5941 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0121 09:03:51.110359 5941 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0121 09:03:51.110380 5941 factory.go:656] Stopping watch factory\\\\nI0121 09:03:51.110380 5941 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 09:03:51.110390 5941 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0121 09:03:51.110520 5941 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 09:03:51.110630 5941 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0121 09:03:51.110702 5941 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0121 09:03:51.110738 5941 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.150217 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.150245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.150262 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.150274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.150283 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.173681 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7f
fd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.214121 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.252181 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.252217 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.252226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.252239 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.252247 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.254063 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.292247 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.333988 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.354071 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.354111 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.354120 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.354130 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.354137 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.374005 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.414182 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.455652 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.455681 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.455689 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.455702 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.455710 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.526952 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 18:58:54.053496928 +0000 UTC Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.537304 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:53 crc kubenswrapper[4618]: E0121 09:03:53.537377 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.537401 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:53 crc kubenswrapper[4618]: E0121 09:03:53.537477 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.537385 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:53 crc kubenswrapper[4618]: E0121 09:03:53.537549 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.557296 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.557321 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.557329 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.557339 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.557347 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.659341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.659460 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.659527 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.659595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.659660 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.673196 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/1.log" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.675537 4618 scope.go:117] "RemoveContainer" containerID="9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d" Jan 21 09:03:53 crc kubenswrapper[4618]: E0121 09:03:53.675649 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.683453 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.692213 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:
38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.699623 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.706237 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.717659 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.725836 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.733406 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.740477 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.761283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.761324 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.761335 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.761349 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.761359 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.772453 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.813625 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 
09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.855878 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a2
1ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.862949 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.862970 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.862978 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.862988 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.862996 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.898689 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.934159 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.964090 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.964127 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.964135 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.964158 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.964168 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:53Z","lastTransitionTime":"2026-01-21T09:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:53 crc kubenswrapper[4618]: I0121 09:03:53.972904 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.011927 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:54Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.065138 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.065185 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.065195 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.065208 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.065216 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.167157 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.167187 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.167196 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.167207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.167216 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.268711 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.268738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.268746 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.268755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.268762 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.370127 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.370169 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.370178 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.370189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.370196 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.472086 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.472129 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.472138 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.472164 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.472173 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.527385 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 16:20:12.42295629 +0000 UTC Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.573702 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.573723 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.573730 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.573739 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.573746 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.675323 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.675444 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.675506 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.675575 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.675638 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.777733 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.777758 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.777766 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.777777 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.777784 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.879487 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.879514 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.879522 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.879533 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.879541 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.980759 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.980958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.980967 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.980977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.980984 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:54Z","lastTransitionTime":"2026-01-21T09:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.998554 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx"] Jan 21 09:03:54 crc kubenswrapper[4618]: I0121 09:03:54.999203 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.001031 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.002162 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.008204 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.016385 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.023726 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.030329 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.041915 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.049759 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.057786 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.064759 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.070965 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.078321 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.082740 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.082775 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.082784 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.082797 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.082806 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.087419 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.093221 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2cb204f-4bf6-441d-95e5-b8fb8644948d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.093263 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cktk\" (UniqueName: \"kubernetes.io/projected/d2cb204f-4bf6-441d-95e5-b8fb8644948d-kube-api-access-4cktk\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.093296 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2cb204f-4bf6-441d-95e5-b8fb8644948d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.093314 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2cb204f-4bf6-441d-95e5-b8fb8644948d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.094348 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.105989 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4
b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.115555 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.122843 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.128914 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.184173 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.184202 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.184211 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.184224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.184232 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.193664 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cktk\" (UniqueName: \"kubernetes.io/projected/d2cb204f-4bf6-441d-95e5-b8fb8644948d-kube-api-access-4cktk\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.193718 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2cb204f-4bf6-441d-95e5-b8fb8644948d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.193747 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2cb204f-4bf6-441d-95e5-b8fb8644948d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.193786 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2cb204f-4bf6-441d-95e5-b8fb8644948d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.194358 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d2cb204f-4bf6-441d-95e5-b8fb8644948d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.194413 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2cb204f-4bf6-441d-95e5-b8fb8644948d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.197885 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2cb204f-4bf6-441d-95e5-b8fb8644948d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.205668 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cktk\" (UniqueName: \"kubernetes.io/projected/d2cb204f-4bf6-441d-95e5-b8fb8644948d-kube-api-access-4cktk\") pod \"ovnkube-control-plane-749d76644c-pcpbx\" (UID: \"d2cb204f-4bf6-441d-95e5-b8fb8644948d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.285880 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.285925 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.285934 4618 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.285947 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.285955 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.294186 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.294260 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.294280 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:04:11.294261932 +0000 UTC m=+50.044729249 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.294312 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.294343 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.294380 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:11.294373232 +0000 UTC m=+50.044840549 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.294426 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.294480 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:11.29446757 +0000 UTC m=+50.044934897 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.308423 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" Jan 21 09:03:55 crc kubenswrapper[4618]: W0121 09:03:55.318064 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2cb204f_4bf6_441d_95e5_b8fb8644948d.slice/crio-4f90eb44e32045334a2f6c247846608ae651dd41ca253a7193c0d7bd06a7fd9b WatchSource:0}: Error finding container 4f90eb44e32045334a2f6c247846608ae651dd41ca253a7193c0d7bd06a7fd9b: Status 404 returned error can't find the container with id 4f90eb44e32045334a2f6c247846608ae651dd41ca253a7193c0d7bd06a7fd9b Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.387678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.387707 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.387717 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.387730 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.387737 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.394672 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.394723 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394798 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394819 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394829 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394862 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:11.394851703 +0000 UTC m=+50.145319020 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394799 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394919 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394929 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.394960 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:11.394952034 +0000 UTC m=+50.145419351 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.397694 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.398810 4618 scope.go:117] "RemoveContainer" containerID="9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.398923 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.488977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.489013 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.489022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.489035 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.489043 4618 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.527771 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 23:03:22.256680127 +0000 UTC Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.537070 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.537080 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.537083 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.537231 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.537364 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:55 crc kubenswrapper[4618]: E0121 09:03:55.537428 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.590644 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.590665 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.590674 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.590685 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.590693 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.680628 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" event={"ID":"d2cb204f-4bf6-441d-95e5-b8fb8644948d","Type":"ContainerStarted","Data":"63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.680668 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" event={"ID":"d2cb204f-4bf6-441d-95e5-b8fb8644948d","Type":"ContainerStarted","Data":"7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.680677 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" event={"ID":"d2cb204f-4bf6-441d-95e5-b8fb8644948d","Type":"ContainerStarted","Data":"4f90eb44e32045334a2f6c247846608ae651dd41ca253a7193c0d7bd06a7fd9b"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.689855 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.691808 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.691833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.691842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.691852 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.691861 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.700015 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.707432 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.713811 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.722796 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.750368 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.763481 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.772813 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.782002 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.791048 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:
38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.793404 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.793438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.793448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.793461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.793470 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.799026 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.806162 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b
19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.813858 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.826040 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.834522 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.842858 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:55Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.895331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.895374 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.895383 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc 
kubenswrapper[4618]: I0121 09:03:55.895397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.895408 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.997528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.997559 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.997567 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.997580 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:55 crc kubenswrapper[4618]: I0121 09:03:55.997588 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:55Z","lastTransitionTime":"2026-01-21T09:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.099476 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.099507 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.099516 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.099529 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.099537 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.201415 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.201450 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.201461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.201474 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.201483 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.303175 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.303211 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.303220 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.303233 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.303243 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.404921 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.404950 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.404959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.404970 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.404979 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.412798 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kpxzc"] Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.413161 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:56 crc kubenswrapper[4618]: E0121 09:03:56.413213 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.422733 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 
requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.431425 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.439749 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.460578 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.468691 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.476787 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.483908 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.490234 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.497455 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.503809 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbsg\" (UniqueName: \"kubernetes.io/projected/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-kube-api-access-hqbsg\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.503961 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.505050 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.506199 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.506225 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.506234 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.506246 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.506253 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.514259 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.520825 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.528489 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:05:15.360109197 +0000 UTC Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.532927 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.541371 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.549405 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.557514 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.565235 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:56Z is after 2025-08-24T17:21:41Z" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.605420 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.605456 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbsg\" (UniqueName: \"kubernetes.io/projected/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-kube-api-access-hqbsg\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:56 crc kubenswrapper[4618]: E0121 09:03:56.605544 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:03:56 crc kubenswrapper[4618]: E0121 09:03:56.605614 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:57.105596094 +0000 UTC m=+35.856063431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.608432 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.608457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.608467 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.608479 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.608488 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.617978 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqbsg\" (UniqueName: \"kubernetes.io/projected/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-kube-api-access-hqbsg\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.710236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.710263 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.710272 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.710283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.710292 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.812580 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.812611 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.812620 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.812633 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.812642 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.914384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.914410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.914420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.914430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:56 crc kubenswrapper[4618]: I0121 09:03:56.914439 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:56Z","lastTransitionTime":"2026-01-21T09:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.016370 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.016398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.016406 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.016416 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.016423 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.109311 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:03:57 crc kubenswrapper[4618]: E0121 09:03:57.109445 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:03:57 crc kubenswrapper[4618]: E0121 09:03:57.109495 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:03:58.109483228 +0000 UTC m=+36.859950545 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.118011 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.118051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.118061 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.118072 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.118080 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.219692 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.219725 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.219736 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.219749 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.219756 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.321425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.321645 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.321653 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.321667 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.321676 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.423018 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.423052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.423060 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.423074 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.423082 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.524863 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.524893 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.524901 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.524912 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.524921 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.529364 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 20:46:03.974154384 +0000 UTC Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.537027 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.537055 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.537104 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:03:57 crc kubenswrapper[4618]: E0121 09:03:57.537110 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:03:57 crc kubenswrapper[4618]: E0121 09:03:57.537193 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:03:57 crc kubenswrapper[4618]: E0121 09:03:57.537295 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.626169 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.626194 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.626202 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.626213 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.626219 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.728404 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.728431 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.728439 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.728448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.728455 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.829744 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.829874 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.829937 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.829995 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.830058 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.931486 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.931513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.931521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.931534 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:57 crc kubenswrapper[4618]: I0121 09:03:57.931542 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:57Z","lastTransitionTime":"2026-01-21T09:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.032834 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.032860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.032871 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.032881 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.032888 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.117698 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc"
Jan 21 09:03:58 crc kubenswrapper[4618]: E0121 09:03:58.117796 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 09:03:58 crc kubenswrapper[4618]: E0121 09:03:58.117976 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:00.117962771 +0000 UTC m=+38.868430088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.134261 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.134286 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.134294 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.134304 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.134312 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.236318 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.236419 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.236487 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.236558 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.236620 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.338426 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.338529 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.338602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.338669 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.338725 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.439932 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.439950 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.439958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.439969 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.439977 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.530511 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:21:16.609872511 +0000 UTC
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.536929 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc"
Jan 21 09:03:58 crc kubenswrapper[4618]: E0121 09:03:58.537016 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.541498 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.541523 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.541531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.541561 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.541577 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.642991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.643016 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.643024 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.643033 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.643054 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.745001 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.745028 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.745036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.745045 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.745052 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.846632 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.846659 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.846666 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.846674 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.846681 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.948236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.948254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.948261 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.948271 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:58 crc kubenswrapper[4618]: I0121 09:03:58.948278 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:58Z","lastTransitionTime":"2026-01-21T09:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.049551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.049600 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.049610 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.049623 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.049631 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.150953 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.150974 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.150983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.150992 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.150999 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.252888 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.252915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.252924 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.252934 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.252942 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.354618 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.354738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.354809 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.354875 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.354933 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.456870 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.456899 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.456907 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.456916 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.456923 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.530716 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:14:58.611310955 +0000 UTC
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.536958 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 09:03:59 crc kubenswrapper[4618]: E0121 09:03:59.537047 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.537159 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.537222 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 09:03:59 crc kubenswrapper[4618]: E0121 09:03:59.537346 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 09:03:59 crc kubenswrapper[4618]: E0121 09:03:59.537414 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.558623 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.558726 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.558815 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.558896 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.559103 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.660747 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.660774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.660782 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.660791 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.660799 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.762171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.762209 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.762219 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.762229 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.762238 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.863460 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.863619 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.863698 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.863761 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.863823 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.965120 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.965167 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.965186 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.965196 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:03:59 crc kubenswrapper[4618]: I0121 09:03:59.965204 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:03:59Z","lastTransitionTime":"2026-01-21T09:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.066839 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.066865 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.066873 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.066881 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.066888 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.133603 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc"
Jan 21 09:04:00 crc kubenswrapper[4618]: E0121 09:04:00.133749 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 09:04:00 crc kubenswrapper[4618]: E0121 09:04:00.133806 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:04.133791109 +0000 UTC m=+42.884258426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.168575 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.168610 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.168619 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.168632 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.168641 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.270117 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.270159 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.270168 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.270178 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.270197 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.371792 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.371825 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.371834 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.371846 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.371854 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.473959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.473987 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.473996 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.474009 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.474021 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.531414 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 18:47:18.659168647 +0000 UTC
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.536826 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc"
Jan 21 09:04:00 crc kubenswrapper[4618]: E0121 09:04:00.536940 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.575030 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.575059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.575067 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.575077 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.575084 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.677020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.677050 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.677058 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.677068 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.677077 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.778958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.778985 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.778993 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.779003 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.779009 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.880348 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.880376 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.880383 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.880393 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.880400 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.982472 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.982513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.982521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.982532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:00 crc kubenswrapper[4618]: I0121 09:04:00.982539 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:00Z","lastTransitionTime":"2026-01-21T09:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.084341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.084379 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.084389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.084401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.084412 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.185798 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.185835 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.185846 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.185858 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.185866 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.287492 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.287522 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.287531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.287544 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.287552 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.388977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.389005 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.389017 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.389056 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.389065 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.490992 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.491026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.491035 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.491044 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.491050 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.532348 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 22:19:00.300146479 +0000 UTC Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.537616 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.537639 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:01 crc kubenswrapper[4618]: E0121 09:04:01.537696 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.537752 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:01 crc kubenswrapper[4618]: E0121 09:04:01.537783 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:01 crc kubenswrapper[4618]: E0121 09:04:01.537894 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.546174 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.555778 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.562717 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.569457 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.581694 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.589998 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.592338 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.592363 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.592371 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.592382 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.592390 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.598976 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.607090 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.614930 4618 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.622229 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.628656 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.635452 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.641641 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc 
kubenswrapper[4618]: I0121 09:04:01.653670 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.661217 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.668696 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.676647 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:
38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.693727 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.693757 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.693765 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.693777 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.693785 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.795209 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.795237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.795244 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.795254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.795264 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.898658 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.898688 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.898696 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.898710 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.898719 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.999866 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.999896 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.999904 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.999918 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:01 crc kubenswrapper[4618]: I0121 09:04:01.999926 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:01Z","lastTransitionTime":"2026-01-21T09:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.101473 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.101496 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.101505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.101514 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.101521 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.203051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.203082 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.203092 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.203104 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.203113 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.274860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.274894 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.274902 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.274915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.274923 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.283398 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:02Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.285869 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.285896 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.285906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.285932 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.285940 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.295805 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:02Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.297983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.298010 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.298019 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.298028 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.298035 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.305836 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list identical to the previous retry elided ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:02Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.307734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.307758 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.307765 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.307774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.307781 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.315118 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list identical to the previous retry elided ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:02Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.317263 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.317295 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.317303 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.317315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.317323 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.324830 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:02Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.324950 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.325994 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.326036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.326046 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.326056 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.326063 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.427945 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.427968 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.427976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.427986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.427994 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.529507 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.529536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.529544 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.529555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.529563 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.532763 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 10:27:27.923483258 +0000 UTC Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.537020 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:02 crc kubenswrapper[4618]: E0121 09:04:02.537168 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.631076 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.631107 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.631128 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.631177 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.631187 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.732665 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.732691 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.732699 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.732708 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.732715 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.834391 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.834602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.834609 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.834617 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.834624 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.936503 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.936541 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.936551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.936565 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:02 crc kubenswrapper[4618]: I0121 09:04:02.936574 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:02Z","lastTransitionTime":"2026-01-21T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.038389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.038414 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.038441 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.038451 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.038459 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.139806 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.139842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.139854 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.139864 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.139872 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.241716 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.241739 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.241746 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.241755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.241762 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.342940 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.342990 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.342999 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.343008 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.343017 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.444792 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.444845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.444853 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.444864 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.444871 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.533760 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:57:05.47815809 +0000 UTC Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.537013 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.537037 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.537056 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:03 crc kubenswrapper[4618]: E0121 09:04:03.537102 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:03 crc kubenswrapper[4618]: E0121 09:04:03.537193 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:03 crc kubenswrapper[4618]: E0121 09:04:03.537273 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.546674 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.546707 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.546716 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.546728 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.546737 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.648979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.649007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.649015 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.649026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.649035 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.750940 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.750964 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.750973 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.750983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.750991 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.852427 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.852464 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.852473 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.852488 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.852495 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.954801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.954829 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.954839 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.954849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:03 crc kubenswrapper[4618]: I0121 09:04:03.954857 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:03Z","lastTransitionTime":"2026-01-21T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.056744 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.056779 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.056790 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.056803 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.056811 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.158359 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.158398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.158406 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.158418 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.158428 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.164688 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:04 crc kubenswrapper[4618]: E0121 09:04:04.164815 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:04:04 crc kubenswrapper[4618]: E0121 09:04:04.164871 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:12.164850756 +0000 UTC m=+50.915318073 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.259830 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.259872 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.259884 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.259901 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.259911 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.361803 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.361832 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.361840 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.361851 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.361858 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.464171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.464204 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.464212 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.464223 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.464248 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.533914 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 00:36:14.064426805 +0000 UTC Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.537170 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:04 crc kubenswrapper[4618]: E0121 09:04:04.537279 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.566157 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.566184 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.566193 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.566203 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.566211 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.668318 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.668345 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.668352 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.668362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.668368 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.770224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.770260 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.770268 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.770278 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.770286 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.872051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.872080 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.872089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.872100 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.872108 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.973177 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.973207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.973215 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.973226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:04 crc kubenswrapper[4618]: I0121 09:04:04.973242 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:04Z","lastTransitionTime":"2026-01-21T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.074525 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.074550 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.074558 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.074569 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.074578 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.176354 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.176380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.176389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.176399 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.176406 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.277842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.277900 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.277909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.277919 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.277926 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.379572 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.379597 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.379605 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.379615 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.379622 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.481028 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.481054 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.481062 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.481071 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.481078 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.534541 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:49:27.653923812 +0000 UTC Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.536767 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.536802 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.536844 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:05 crc kubenswrapper[4618]: E0121 09:04:05.536840 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:05 crc kubenswrapper[4618]: E0121 09:04:05.536896 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:05 crc kubenswrapper[4618]: E0121 09:04:05.536986 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.582200 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.582221 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.582230 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.582238 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.582256 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.684322 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.684347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.684354 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.684363 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.684370 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.786501 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.786528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.786535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.786545 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.786551 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.888230 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.888263 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.888272 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.888283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.888291 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.990372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.990399 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.990406 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.990415 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:05 crc kubenswrapper[4618]: I0121 09:04:05.990422 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:05Z","lastTransitionTime":"2026-01-21T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.091886 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.091922 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.091935 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.091951 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.091962 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.193903 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.193926 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.193935 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.193945 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.193952 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.295623 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.295652 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.295660 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.295672 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.295679 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.397127 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.397175 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.397184 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.397192 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.397199 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.499319 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.499353 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.499363 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.499377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.499388 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.534613 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:03:45.883593357 +0000 UTC Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.536914 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:06 crc kubenswrapper[4618]: E0121 09:04:06.537006 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.600678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.600701 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.600709 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.600717 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.600726 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.701818 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.701841 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.701849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.701859 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.701866 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.803306 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.803330 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.803338 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.803348 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.803355 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.904997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.905020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.905029 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.905038 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:06 crc kubenswrapper[4618]: I0121 09:04:06.905045 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:06Z","lastTransitionTime":"2026-01-21T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.006423 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.006472 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.006490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.006505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.006517 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.107936 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.107962 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.107970 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.107978 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.107986 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.209918 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.209942 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.209950 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.209958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.209965 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.311782 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.311814 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.311822 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.311831 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.311838 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.413753 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.413783 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.413791 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.413800 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.413807 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.514975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.514995 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.515004 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.515012 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.515020 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.535657 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 05:37:02.962460039 +0000 UTC Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.536817 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.536849 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:07 crc kubenswrapper[4618]: E0121 09:04:07.536887 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.537053 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:07 crc kubenswrapper[4618]: E0121 09:04:07.537116 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:07 crc kubenswrapper[4618]: E0121 09:04:07.537300 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.537320 4618 scope.go:117] "RemoveContainer" containerID="9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.616084 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.616112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.616122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.616133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.616182 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.707744 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/1.log" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.709801 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.710102 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.717773 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.717806 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.717817 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.717829 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.717837 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.718383 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.724727 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.725586 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc 
kubenswrapper[4618]: I0121 09:04:07.729280 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.738879 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.748949 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.760369 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.767327 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.779500 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.788079 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.798003 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.807670 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.819572 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.819606 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.819614 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc 
kubenswrapper[4618]: I0121 09:04:07.819627 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.819650 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.821003 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\
":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.831415 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.839479 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.847188 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.853809 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.861559 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.870225 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.877282 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.884801 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301c
acba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.891596 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc 
kubenswrapper[4618]: I0121 09:04:07.905368 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.913869 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.921228 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.921258 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.921267 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.921287 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.921296 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:07Z","lastTransitionTime":"2026-01-21T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.927888 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.942087 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.950627 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.959523 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:
38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.967211 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.974284 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5
f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.982187 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:07 crc kubenswrapper[4618]: I0121 09:04:07.993608 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\
":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:07Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.001303 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.008820 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.016476 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.023032 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.023129 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.023230 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.023301 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.023354 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.023998 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.034441 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.125347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.125377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.125386 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.125397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.125405 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.227011 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.227046 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.227057 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.227071 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.227078 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.328489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.328524 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.328532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.328544 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.328553 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.430154 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.430184 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.430192 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.430204 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.430213 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.531970 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.531999 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.532007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.532016 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.532024 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.536352 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 20:39:29.470675378 +0000 UTC Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.537515 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:08 crc kubenswrapper[4618]: E0121 09:04:08.537601 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.633545 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.633580 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.633590 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.633602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.633612 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.712825 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/2.log" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.713271 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/1.log" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.715005 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65" exitCode=1 Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.715025 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.715053 4618 scope.go:117] "RemoveContainer" containerID="9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.715429 4618 scope.go:117] "RemoveContainer" containerID="6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65" Jan 21 09:04:08 crc kubenswrapper[4618]: E0121 09:04:08.715550 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.725047 4618 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.733916 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.734706 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc 
kubenswrapper[4618]: I0121 09:04:08.734731 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.734757 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.734771 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.734779 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.743081 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\"
,\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.750974 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.758536 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.765477 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.775459 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.787525 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c69bd964fd0817995e9a81a1dffb58cc0e0a9d6818012dfd072a13c2621522d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:03:52Z\\\",\\\"message\\\":\\\"er during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:03:52Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:03:52.251686 6073 services_controller.go:434] Service openshift-controller-manager/controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{controller-manager openshift-controller-manager bec3404d-8a9b-42cf-8577-99faf17d6a73 4118 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-controller-manager] map[operator.openshift.io/spec-hash:b3b96749ab82e4de02ef6aa9f0e168108d09315e18d73931c12251d267378e74 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics 
on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\
\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: 
I0121 09:04:08.795429 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.803312 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.811960 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.822247 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.830859 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.836394 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.836417 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.836425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.836438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.836445 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.838291 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.845690 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301c
acba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.852406 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc 
kubenswrapper[4618]: I0121 09:04:08.864520 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.877560 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.937903 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.937933 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.937941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.937954 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:08 crc kubenswrapper[4618]: I0121 09:04:08.937962 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:08Z","lastTransitionTime":"2026-01-21T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.039349 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.039377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.039385 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.039395 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.039403 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.141184 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.141219 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.141227 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.141240 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.141248 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.243516 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.243549 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.243560 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.243571 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.243579 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.345167 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.345210 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.345219 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.345233 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.345244 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.447453 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.447485 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.447496 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.447507 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.447516 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.536851 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:09 crc kubenswrapper[4618]: E0121 09:04:09.536929 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.536961 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:29:19.524091178 +0000 UTC Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.536981 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.537035 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:09 crc kubenswrapper[4618]: E0121 09:04:09.537079 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:09 crc kubenswrapper[4618]: E0121 09:04:09.537219 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.549026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.549062 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.549072 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.549084 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.549093 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.651557 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.651598 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.651607 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.651619 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.651626 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.718456 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/2.log" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.720888 4618 scope.go:117] "RemoveContainer" containerID="6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65" Jan 21 09:04:09 crc kubenswrapper[4618]: E0121 09:04:09.721010 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.729492 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.737138 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.745535 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:
38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.753031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.753074 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.753084 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.753095 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.753104 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.753580 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.760651 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.766564 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.774103 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.785631 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.793133 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.800701 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.808384 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.819969 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.827579 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.833604 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.840355 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.847211 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc 
kubenswrapper[4618]: I0121 09:04:09.854855 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.854885 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.854894 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.854908 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.854916 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.867364 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.882152 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:09Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.956913 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.956946 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.956955 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.956969 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:09 crc kubenswrapper[4618]: I0121 09:04:09.956978 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:09Z","lastTransitionTime":"2026-01-21T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.059208 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.059237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.059245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.059257 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.059264 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.160934 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.160959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.160967 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.160976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.160983 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.262421 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.262452 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.262461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.262471 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.262482 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.363783 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.363811 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.363818 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.363828 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.363836 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.465285 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.465328 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.465337 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.465349 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.465357 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.537364 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.537370 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:51:40.13903419 +0000 UTC Jan 21 09:04:10 crc kubenswrapper[4618]: E0121 09:04:10.537455 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.566794 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.566821 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.566830 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.566839 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.566846 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.668734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.668755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.668763 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.668771 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.668778 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.770174 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.770238 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.770247 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.770259 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.770268 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.871593 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.871621 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.871629 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.871639 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.871646 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.973564 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.973595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.973604 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.973615 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:10 crc kubenswrapper[4618]: I0121 09:04:10.973623 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:10Z","lastTransitionTime":"2026-01-21T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.075653 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.075682 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.075690 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.075700 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.075707 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.177437 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.177468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.177477 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.177487 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.177495 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.278822 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.278845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.278853 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.278865 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.278872 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.320381 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.320441 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.320475 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.320509 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:04:43.32049304 +0000 UTC m=+82.070960357 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.320540 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.320575 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:43.320563013 +0000 UTC m=+82.071030329 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.320594 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.320632 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-21 09:04:43.320623467 +0000 UTC m=+82.071090784 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.381645 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.381665 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.381677 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.381687 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.381695 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.421737 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.421764 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421836 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421837 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421866 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421877 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 
09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421909 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:43.421899197 +0000 UTC m=+82.172366524 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421847 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421951 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.421977 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:43.421968468 +0000 UTC m=+82.172435784 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.483399 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.483430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.483457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.483466 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.483473 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.537327 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.537398 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.537468 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.537465 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 06:41:56.085339216 +0000 UTC Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.537494 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.537644 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:11 crc kubenswrapper[4618]: E0121 09:04:11.537706 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.546854 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be 
initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.554242 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.562115 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.569714 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.577253 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.584669 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.584727 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.584741 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.584749 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc 
kubenswrapper[4618]: I0121 09:04:11.584759 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.584766 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.591992 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.598208 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.608132 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.619812 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.627505 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.636192 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.648736 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1
abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.656977 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.664255 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.670192 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.676921 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.683364 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:11Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:11 crc 
kubenswrapper[4618]: I0121 09:04:11.686413 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.686438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.686448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.686461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.686469 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.788383 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.788407 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.788415 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.788425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.788432 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.890308 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.890342 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.890350 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.890361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.890369 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.991999 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.992096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.992183 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.992248 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:11 crc kubenswrapper[4618]: I0121 09:04:11.992314 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:11Z","lastTransitionTime":"2026-01-21T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.093565 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.093681 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.093745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.093799 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.093851 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.196020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.196052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.196060 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.196073 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.196081 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.227490 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.227586 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.227627 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:04:28.227615951 +0000 UTC m=+66.978083268 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.298166 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.298202 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.298211 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.298224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.298235 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.399941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.399968 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.399977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.400002 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.400011 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.501780 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.501854 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.501871 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.501889 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.501931 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.537574 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 17:08:55.368076879 +0000 UTC Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.537653 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.537726 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.543452 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.543479 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.543490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.543500 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.543507 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.551044 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:12Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.553035 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.553062 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.553070 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.553096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.553104 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.560462 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:12Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.562209 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.562234 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.562242 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.562255 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.562263 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.573122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.573237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.573295 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.573359 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.573428 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.581391 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:12Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.583351 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.583378 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.583387 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.583397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.583406 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.591545 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:12Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:12 crc kubenswrapper[4618]: E0121 09:04:12.591645 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.603722 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.603743 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.603750 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.603758 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.603765 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.705869 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.705895 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.705903 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.705912 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.705920 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.807511 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.807538 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.807546 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.807555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.807572 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.909465 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.909500 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.909509 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.909518 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:12 crc kubenswrapper[4618]: I0121 09:04:12.909525 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:12Z","lastTransitionTime":"2026-01-21T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.011297 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.011361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.011371 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.011384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.011392 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.112627 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.112644 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.112653 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.112662 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.112670 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.214259 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.214287 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.214297 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.214328 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.214345 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.315855 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.315886 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.315899 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.315909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.315916 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.417289 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.417400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.417456 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.417525 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.417588 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.518958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.518979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.518988 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.518997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.519005 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.537415 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:13 crc kubenswrapper[4618]: E0121 09:04:13.537499 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.537580 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.537680 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.537751 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:15:15.597370818 +0000 UTC Jan 21 09:04:13 crc kubenswrapper[4618]: E0121 09:04:13.537725 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:13 crc kubenswrapper[4618]: E0121 09:04:13.538229 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.620376 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.620398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.620406 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.620415 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.620424 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.722219 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.722246 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.722254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.722264 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.722272 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.824133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.824177 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.824186 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.824194 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.824202 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.925480 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.925597 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.925653 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.925703 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:13 crc kubenswrapper[4618]: I0121 09:04:13.925750 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:13Z","lastTransitionTime":"2026-01-21T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.027022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.027052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.027061 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.027073 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.027081 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.128774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.128811 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.128820 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.128834 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.128843 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.230559 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.230595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.230605 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.230616 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.230624 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.332006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.332036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.332044 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.332055 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.332063 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.433604 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.433636 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.433646 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.433657 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.433666 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.535486 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.535517 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.535527 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.535555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.535564 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.536748 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:14 crc kubenswrapper[4618]: E0121 09:04:14.536851 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.537847 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 19:41:38.999675064 +0000 UTC Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.637541 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.637636 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.637767 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.637827 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.637877 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.739860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.739887 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.739894 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.739903 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.739927 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.841341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.841381 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.841389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.841401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.841408 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.943163 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.943276 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.943381 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.943436 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:14 crc kubenswrapper[4618]: I0121 09:04:14.943461 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:14Z","lastTransitionTime":"2026-01-21T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.045250 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.045276 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.045285 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.045295 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.045302 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.147131 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.147185 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.147193 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.147204 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.147212 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.249108 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.249137 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.249164 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.249175 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.249183 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.350561 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.350588 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.350595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.350604 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.350611 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.451792 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.451818 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.451829 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.451838 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.451846 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.536785 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.536838 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:15 crc kubenswrapper[4618]: E0121 09:04:15.536899 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.536921 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:15 crc kubenswrapper[4618]: E0121 09:04:15.537021 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:15 crc kubenswrapper[4618]: E0121 09:04:15.537095 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.537982 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 20:26:59.490629003 +0000 UTC Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.553296 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.553322 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.553331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.553340 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.553348 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.654889 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.654917 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.654926 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.654938 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.654948 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.756994 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.757061 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.757071 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.757085 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.757096 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.859047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.859093 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.859104 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.859114 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.859122 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.961495 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.961526 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.961534 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.961545 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:15 crc kubenswrapper[4618]: I0121 09:04:15.961554 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:15Z","lastTransitionTime":"2026-01-21T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.063047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.063070 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.063079 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.063089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.063096 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.164993 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.165025 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.165034 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.165045 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.165053 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.267051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.267085 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.267094 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.267107 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.267117 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.368261 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.368286 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.368295 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.368304 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.368311 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.469727 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.469816 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.469831 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.469841 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.469848 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.537528 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:16 crc kubenswrapper[4618]: E0121 09:04:16.537738 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.538133 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:09:41.183649103 +0000 UTC Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.571230 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.571292 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.571301 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.571312 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.571319 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.672978 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.673026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.673041 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.673059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.673125 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.774594 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.774622 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.774631 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.774644 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.774652 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.876290 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.876315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.876323 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.876333 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.876340 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.978322 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.978355 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.978367 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.978389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:16 crc kubenswrapper[4618]: I0121 09:04:16.978399 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:16Z","lastTransitionTime":"2026-01-21T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.080185 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.080207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.080217 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.080228 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.080234 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.182708 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.182746 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.182755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.182766 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.182775 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.284714 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.284743 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.284751 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.284762 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.284771 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.386180 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.386203 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.386211 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.386222 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.386231 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.488304 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.488332 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.488365 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.488377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.488393 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.536788 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.536826 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:17 crc kubenswrapper[4618]: E0121 09:04:17.536868 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.536923 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:17 crc kubenswrapper[4618]: E0121 09:04:17.537057 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:17 crc kubenswrapper[4618]: E0121 09:04:17.537165 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.538566 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 13:19:40.784201578 +0000 UTC Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.590205 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.590239 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.590248 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.590260 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.590271 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.692375 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.692416 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.692425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.692439 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.692448 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.794324 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.794357 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.794366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.794378 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.794398 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.895720 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.895797 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.895824 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.895837 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.895845 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.997624 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.997662 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.997670 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.997680 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:17 crc kubenswrapper[4618]: I0121 09:04:17.997687 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:17Z","lastTransitionTime":"2026-01-21T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.099567 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.099689 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.099745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.099816 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.099867 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.202093 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.202214 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.202273 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.202357 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.202438 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.304489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.304513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.304521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.304531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.304555 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.406208 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.406315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.406410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.406477 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.406532 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.507746 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.507774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.507783 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.507795 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.507804 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.537247 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:18 crc kubenswrapper[4618]: E0121 09:04:18.537360 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.539321 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 05:11:48.896758508 +0000 UTC Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.609874 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.609907 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.609915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.609925 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.609934 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.711798 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.711820 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.711828 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.711837 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.711845 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.813890 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.813918 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.813927 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.813937 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.813946 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.915448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.915482 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.915491 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.915503 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:18 crc kubenswrapper[4618]: I0121 09:04:18.915511 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:18Z","lastTransitionTime":"2026-01-21T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.017595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.017637 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.017646 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.017662 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.017670 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.119101 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.119164 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.119175 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.119187 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.119196 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.220538 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.220562 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.220570 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.220580 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.220621 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.322217 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.322236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.322245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.322254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.322262 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.424377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.424401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.424419 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.424430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.424438 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.525811 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.525849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.525857 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.525867 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.525875 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.537118 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.537191 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:19 crc kubenswrapper[4618]: E0121 09:04:19.537248 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.537124 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:19 crc kubenswrapper[4618]: E0121 09:04:19.537470 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:19 crc kubenswrapper[4618]: E0121 09:04:19.537560 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.539487 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 19:31:04.663850416 +0000 UTC Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.627379 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.627400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.627416 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.627427 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.627438 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.729059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.729087 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.729096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.729106 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.729113 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.830540 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.830572 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.830581 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.830592 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.830600 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.932328 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.932357 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.932366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.932378 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:19 crc kubenswrapper[4618]: I0121 09:04:19.932387 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:19Z","lastTransitionTime":"2026-01-21T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.033944 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.033979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.033988 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.034001 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.034010 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.135451 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.135502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.135510 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.135522 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.135531 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.237021 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.237046 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.237054 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.237063 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.237071 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.338591 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.338642 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.338651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.338660 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.338667 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.440358 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.440396 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.440407 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.440427 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.440435 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.537797 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:20 crc kubenswrapper[4618]: E0121 09:04:20.537892 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.539876 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 21:35:50.981114867 +0000 UTC Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.542048 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.542078 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.542089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.542099 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.542106 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.643764 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.643797 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.643805 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.643817 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.643825 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.745502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.745528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.745536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.745546 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.745553 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.847170 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.847215 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.847224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.847237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.847246 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.948818 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.948852 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.948861 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.948873 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:20 crc kubenswrapper[4618]: I0121 09:04:20.948883 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:20Z","lastTransitionTime":"2026-01-21T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.050785 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.050816 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.050825 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.050835 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.050842 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.152986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.153018 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.153026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.153038 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.153046 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.254812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.254845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.254853 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.254868 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.254876 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.356541 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.356589 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.356597 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.356608 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.356616 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.457873 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.457906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.457914 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.457928 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.457936 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.537277 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.537322 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:21 crc kubenswrapper[4618]: E0121 09:04:21.537400 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.537455 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:21 crc kubenswrapper[4618]: E0121 09:04:21.537798 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:21 crc kubenswrapper[4618]: E0121 09:04:21.537959 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.537976 4618 scope.go:117] "RemoveContainer" containerID="6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65" Jan 21 09:04:21 crc kubenswrapper[4618]: E0121 09:04:21.538255 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.540104 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 21:15:47.785263345 +0000 UTC Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.546981 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.553987 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.559231 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.559256 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.559276 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.559288 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.559296 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.560808 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.573231 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.581700 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.589474 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.598916 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.611890 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.620819 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.626912 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.633490 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.640294 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc 
kubenswrapper[4618]: I0121 09:04:21.652507 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.660466 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.660872 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.660897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.660905 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.660916 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.660924 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.667933 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.675599 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.683724 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:
38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.690937 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:21Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.762904 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.762933 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.762941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.762953 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.762962 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.865371 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.865400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.865408 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.865417 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.865425 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.966983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.967006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.967014 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.967024 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:21 crc kubenswrapper[4618]: I0121 09:04:21.967030 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:21Z","lastTransitionTime":"2026-01-21T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.068936 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.068972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.068981 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.068995 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.069002 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.170604 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.170649 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.170658 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.170670 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.170677 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.272579 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.272613 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.272624 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.272635 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.272645 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.375017 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.375043 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.375051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.375073 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.375080 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.477262 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.477291 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.477300 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.477311 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.477320 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.536794 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.536899 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.540901 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:16:31.353186288 +0000 UTC Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.579505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.579533 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.579542 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.579553 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.579562 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.675305 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.675341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.675350 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.675362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.675372 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.683836 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:22Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.686059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.686091 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.686099 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.686111 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.686119 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.694091 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:22Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.696103 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.696131 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.696155 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.696168 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.696177 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.703471 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:22Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.705269 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.705295 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.705303 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.705312 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.705319 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.712555 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:22Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.714336 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.714364 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.714372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.714382 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.714388 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.721622 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:22Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:22 crc kubenswrapper[4618]: E0121 09:04:22.721723 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.722735 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.722763 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.722772 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.722781 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.722787 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.824368 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.824392 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.824400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.824410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.824417 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.926192 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.926225 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.926234 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.926243 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:22 crc kubenswrapper[4618]: I0121 09:04:22.926252 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:22Z","lastTransitionTime":"2026-01-21T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.027330 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.027365 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.027374 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.027387 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.027395 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.129422 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.129461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.129469 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.129480 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.129488 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.231556 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.231588 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.231597 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.231629 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.231637 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.333347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.333379 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.333387 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.333398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.333405 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.435036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.435066 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.435077 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.435087 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.435095 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536517 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536560 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536571 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536579 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536737 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:23 crc kubenswrapper[4618]: E0121 09:04:23.536813 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536856 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.536908 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:23 crc kubenswrapper[4618]: E0121 09:04:23.536933 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:23 crc kubenswrapper[4618]: E0121 09:04:23.536999 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.541646 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 09:10:07.012343463 +0000 UTC Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.638629 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.638656 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.638664 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.638676 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.638687 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.740306 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.740331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.740339 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.740348 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.740365 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.842677 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.842712 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.842721 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.842731 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.842742 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.944127 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.944168 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.944178 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.944189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:23 crc kubenswrapper[4618]: I0121 09:04:23.944197 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:23Z","lastTransitionTime":"2026-01-21T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.045652 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.045673 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.045681 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.045690 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.045697 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.147244 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.147270 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.147279 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.147287 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.147295 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.249372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.249397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.249405 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.249414 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.249421 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.350917 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.350962 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.350971 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.350985 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.350994 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.453052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.453077 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.453085 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.453094 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.453102 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.537716 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:24 crc kubenswrapper[4618]: E0121 09:04:24.537808 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.542314 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:31:50.217157247 +0000 UTC Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.554377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.554397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.554405 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.554414 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.554421 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.656333 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.656366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.656374 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.656388 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.656397 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.758477 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.758508 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.758516 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.758526 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.758534 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.860745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.860774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.860783 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.860792 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.860799 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.962196 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.962225 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.962233 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.962245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:24 crc kubenswrapper[4618]: I0121 09:04:24.962255 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:24Z","lastTransitionTime":"2026-01-21T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.064267 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.064299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.064320 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.064333 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.064341 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.165909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.165936 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.165945 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.165956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.165966 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.267290 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.267317 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.267325 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.267337 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.267345 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.369229 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.369257 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.369296 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.369306 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.369314 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.470434 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.470505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.470515 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.470525 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.470532 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.537533 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.537568 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.537536 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:25 crc kubenswrapper[4618]: E0121 09:04:25.537645 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:25 crc kubenswrapper[4618]: E0121 09:04:25.537725 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:25 crc kubenswrapper[4618]: E0121 09:04:25.537800 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.543235 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 03:22:18.396517297 +0000 UTC Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.572292 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.572318 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.572327 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.572336 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.572344 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.674168 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.674203 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.674211 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.674226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.674235 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.775515 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.775535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.775543 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.775552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.775562 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.876595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.876623 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.876631 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.876640 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.876647 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.978783 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.978816 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.978823 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.978836 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:25 crc kubenswrapper[4618]: I0121 09:04:25.978845 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:25Z","lastTransitionTime":"2026-01-21T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.080510 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.080537 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.080545 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.080555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.080563 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.182293 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.182319 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.182327 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.182338 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.182346 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.284356 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.284380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.284388 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.284398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.284414 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.386302 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.386337 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.386347 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.386361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.386370 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.487711 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.487728 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.487736 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.487745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.487753 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.537445 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:26 crc kubenswrapper[4618]: E0121 09:04:26.537530 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.544173 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 06:57:17.613341831 +0000 UTC Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.589668 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.589690 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.589698 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.589707 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.589715 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.691593 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.691623 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.691631 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.691640 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.691648 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.792928 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.792955 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.792963 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.792972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.792979 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.894972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.895011 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.895020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.895031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.895039 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.996521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.996579 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.996589 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.996602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:26 crc kubenswrapper[4618]: I0121 09:04:26.996611 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:26Z","lastTransitionTime":"2026-01-21T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.098705 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.098734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.098743 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.098754 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.098763 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.200620 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.200649 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.200656 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.200667 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.200675 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.302256 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.302283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.302291 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.302300 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.302307 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.403728 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.403755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.403763 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.403774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.403782 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.504787 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.504817 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.504825 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.504837 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.504845 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.537255 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.537297 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:27 crc kubenswrapper[4618]: E0121 09:04:27.537354 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:27 crc kubenswrapper[4618]: E0121 09:04:27.537441 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.537521 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:27 crc kubenswrapper[4618]: E0121 09:04:27.537623 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.544992 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:49:10.448400928 +0000 UTC Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.606298 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.606331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.606339 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.606353 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.606361 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.707801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.707828 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.707836 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.707846 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.707853 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.809329 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.809360 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.809368 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.809380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.809389 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.911474 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.911512 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.911521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.911532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:27 crc kubenswrapper[4618]: I0121 09:04:27.911542 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:27Z","lastTransitionTime":"2026-01-21T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.013216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.013245 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.013254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.013264 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.013272 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.114539 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.114567 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.114576 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.114586 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.114593 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.216322 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.216502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.216582 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.216651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.216714 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.244816 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:28 crc kubenswrapper[4618]: E0121 09:04:28.244963 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:04:28 crc kubenswrapper[4618]: E0121 09:04:28.245020 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:00.245005522 +0000 UTC m=+98.995472839 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.318014 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.318041 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.318050 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.318061 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.318071 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.419476 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.419502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.419525 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.419536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.419544 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.521698 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.521726 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.521735 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.521747 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.521756 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.537585 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:28 crc kubenswrapper[4618]: E0121 09:04:28.537684 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.545061 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 23:47:24.798013983 +0000 UTC Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.623583 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.623698 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.623713 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.623722 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.623730 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.725179 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.725213 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.725222 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.725234 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.725242 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.827320 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.827351 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.827360 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.827372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.827380 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.929503 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.929547 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.929558 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.929567 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:28 crc kubenswrapper[4618]: I0121 09:04:28.929574 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:28Z","lastTransitionTime":"2026-01-21T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.030930 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.030955 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.030964 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.030980 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.030987 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.132377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.132403 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.132413 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.132423 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.132431 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.234385 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.234430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.234441 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.234465 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.234475 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.337534 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.337564 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.337574 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.337585 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.337594 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.439197 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.439225 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.439233 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.439241 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.439249 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.536786 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:29 crc kubenswrapper[4618]: E0121 09:04:29.536884 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.536900 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:29 crc kubenswrapper[4618]: E0121 09:04:29.537172 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.537207 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:29 crc kubenswrapper[4618]: E0121 09:04:29.537309 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.540492 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.540519 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.540535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.540546 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.540556 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.545817 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 14:13:15.370809286 +0000 UTC Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.642191 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.642224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.642232 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.642270 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.642279 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.743470 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.743493 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.743502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.743512 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.743536 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.760599 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/0.log" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.760658 4618 generic.go:334] "Generic (PLEG): container finished" podID="052a66c4-94ce-4336-93f6-1d0023e58cc4" containerID="79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba" exitCode=1 Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.760691 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerDied","Data":"79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.761167 4618 scope.go:117] "RemoveContainer" containerID="79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.773874 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.784823 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.794508 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.803699 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.812183 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.820694 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.828465 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc 
kubenswrapper[4618]: I0121 09:04:29.841817 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.844864 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.844898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.844908 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.844925 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.844935 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.850247 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25df
f5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.858667 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.867132 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.876698 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\"
,\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.884852 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.892477 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.899394 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.906279 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.918008 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.926179 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:29Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.946675 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.946707 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.946720 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.946734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:29 crc kubenswrapper[4618]: I0121 09:04:29.946743 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:29Z","lastTransitionTime":"2026-01-21T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.048835 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.048958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.049019 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.049087 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.049168 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.151299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.151520 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.151603 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.151676 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.151731 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.254009 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.254041 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.254049 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.254067 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.254076 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.356115 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.356158 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.356167 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.356183 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.356192 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.457991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.458020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.458030 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.458041 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.458050 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.536846 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:30 crc kubenswrapper[4618]: E0121 09:04:30.537131 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.546785 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 01:11:10.593519436 +0000 UTC Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.559917 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.560030 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.560113 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.560215 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.560293 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.661691 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.661815 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.661872 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.661940 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.662004 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.765685 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.765729 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.765739 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.765753 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.765763 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.766957 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/0.log" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.767005 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerStarted","Data":"0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.777161 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.786006 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.793698 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.800819 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.807365 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc 
kubenswrapper[4618]: I0121 09:04:30.820057 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.827991 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.836183 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.844982 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.853696 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 
09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.861567 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.867112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.867153 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.867164 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc 
kubenswrapper[4618]: I0121 09:04:30.867176 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.867184 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.869015 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.875651 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.882689 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.894490 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.902348 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.911284 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.919018 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:30Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.968696 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.968723 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.968732 4618 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.968745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:30 crc kubenswrapper[4618]: I0121 09:04:30.968753 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:30Z","lastTransitionTime":"2026-01-21T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.070199 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.070227 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.070236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.070250 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.070258 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.171641 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.171689 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.171698 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.171709 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.171717 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.273035 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.273061 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.273069 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.273079 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.273108 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.374502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.374523 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.374531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.374551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.374558 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.475935 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.475966 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.475975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.475987 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.475995 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.536836 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.536886 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:31 crc kubenswrapper[4618]: E0121 09:04:31.536953 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.536994 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:31 crc kubenswrapper[4618]: E0121 09:04:31.537075 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:31 crc kubenswrapper[4618]: E0121 09:04:31.537178 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.545726 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.546937 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 04:17:23.680752766 +0000 UTC Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.553788 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2b
d574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.565878 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.574387 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.576887 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.576910 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.576921 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.576930 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.576937 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.582369 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.589741 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.597953 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e70
2969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.608329 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.614549 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.621776 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301c
acba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.628266 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc 
kubenswrapper[4618]: I0121 09:04:31.640518 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.651087 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.660374 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.667971 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.675937 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.678166 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.678195 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.678205 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.678216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.678225 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.685636 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.693377 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:31Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.779232 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.779257 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.779266 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.779277 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.779285 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.881502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.881521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.881528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.881539 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.881556 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.982944 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.983009 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.983019 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.983032 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:31 crc kubenswrapper[4618]: I0121 09:04:31.983041 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:31Z","lastTransitionTime":"2026-01-21T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.085058 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.085081 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.085089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.085099 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.085106 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.186188 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.186442 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.186532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.186642 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.186721 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.288688 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.288717 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.288725 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.288737 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.288748 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.390443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.390459 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.390467 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.390477 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.390484 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.492203 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.492392 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.492457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.492532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.492604 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.537642 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.537734 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.547889 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 00:03:07.09517981 +0000 UTC Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.594460 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.594483 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.594491 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.594503 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.594511 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.696273 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.696299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.696307 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.696315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.696321 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.797876 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.797900 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.797908 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.797917 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.797924 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.846366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.846390 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.846415 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.846425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.846432 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.855062 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:32Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.857169 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.857199 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.857207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.857220 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.857229 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.864928 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:32Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.866775 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.866797 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.866806 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.866815 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.866822 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.874736 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:32Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.877214 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.877257 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.877268 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.877281 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.877291 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.885194 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:32Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.890197 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.890309 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.890379 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.890443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.890500 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.898879 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:32Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:32 crc kubenswrapper[4618]: E0121 09:04:32.899104 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.900051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.900153 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.900219 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.900284 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:32 crc kubenswrapper[4618]: I0121 09:04:32.900342 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:32Z","lastTransitionTime":"2026-01-21T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.002120 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.002297 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.002379 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.002459 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.002526 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.104790 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.104927 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.105006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.105073 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.105152 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.207190 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.207306 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.207371 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.207437 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.207501 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.308809 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.308838 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.308847 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.308858 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.308866 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.410644 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.410678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.410686 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.410697 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.410705 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.512723 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.512751 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.512759 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.512770 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.512779 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.537503 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.537544 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.537507 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:33 crc kubenswrapper[4618]: E0121 09:04:33.537615 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:33 crc kubenswrapper[4618]: E0121 09:04:33.537687 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:33 crc kubenswrapper[4618]: E0121 09:04:33.537787 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.548595 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 21:31:58.59012574 +0000 UTC Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.614258 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.614285 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.614293 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.614302 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.614309 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.715972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.716004 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.716012 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.716028 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.716036 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.817370 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.817393 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.817405 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.817417 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.817425 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.919045 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.919081 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.919089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.919101 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:33 crc kubenswrapper[4618]: I0121 09:04:33.919112 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:33Z","lastTransitionTime":"2026-01-21T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.020021 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.020047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.020055 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.020066 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.020074 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.122536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.122577 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.122586 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.122596 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.122603 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.224496 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.224552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.224570 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.224579 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.224587 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.326283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.326422 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.326490 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.326569 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.326626 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.427966 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.427994 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.428002 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.428013 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.428019 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.529562 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.529703 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.529774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.529833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.529881 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.536834 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:34 crc kubenswrapper[4618]: E0121 09:04:34.536917 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.549049 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 05:21:57.281765584 +0000 UTC Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.631916 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.631938 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.631946 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.631956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.631966 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.733359 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.733400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.733411 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.733427 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.733439 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.835129 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.835174 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.835183 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.835194 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.835201 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.936799 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.936830 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.936839 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.936849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:34 crc kubenswrapper[4618]: I0121 09:04:34.936856 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:34Z","lastTransitionTime":"2026-01-21T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.038223 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.038252 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.038260 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.038270 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.038278 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.140052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.140078 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.140086 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.140096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.140103 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.241714 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.241756 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.241766 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.241784 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.241793 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.342991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.343025 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.343034 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.343044 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.343053 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.444507 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.444532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.444541 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.444551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.444559 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.537285 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.537326 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.537395 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:35 crc kubenswrapper[4618]: E0121 09:04:35.537394 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:35 crc kubenswrapper[4618]: E0121 09:04:35.537498 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:35 crc kubenswrapper[4618]: E0121 09:04:35.537622 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.546289 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.546315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.546323 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.546348 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.546357 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.549568 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 20:29:58.496707542 +0000 UTC Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.647612 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.647634 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.647642 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.647652 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.647659 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.749020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.749049 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.749058 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.749067 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.749075 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.850809 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.850834 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.850842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.850852 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.850859 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.951870 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.951894 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.951904 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.951915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:35 crc kubenswrapper[4618]: I0121 09:04:35.951923 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:35Z","lastTransitionTime":"2026-01-21T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.053371 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.053404 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.053412 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.053430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.053440 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.155229 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.155256 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.155268 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.155283 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.155292 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.256904 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.256952 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.256966 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.256982 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.256994 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.358427 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.358459 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.358468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.358479 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.358489 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.459390 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.459405 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.459412 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.459420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.459427 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.537553 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:36 crc kubenswrapper[4618]: E0121 09:04:36.537655 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.538122 4618 scope.go:117] "RemoveContainer" containerID="6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.550633 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:28:12.116894187 +0000 UTC Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.561085 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.561108 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.561117 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.561128 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.561153 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.663080 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.663255 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.663264 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.663277 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.663286 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.764939 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.764975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.764984 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.764998 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.765007 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.779537 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/2.log" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.781506 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.781847 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.795681 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.804135 4618 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.812176 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.818936 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.826470 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.838533 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator 
fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.847624 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.857338 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.867235 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.867266 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.867274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.867287 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.867295 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.869848 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.877859 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.885400 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.892107 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.899675 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.917929 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc 
kubenswrapper[4618]: I0121 09:04:36.929930 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849d
b6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.938287 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.947543 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.956022 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:36Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.969462 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.969491 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.969500 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.969513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:36 crc kubenswrapper[4618]: I0121 09:04:36.969522 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:36Z","lastTransitionTime":"2026-01-21T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.071428 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.071459 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.071468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.071479 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.071488 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.172898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.172949 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.172959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.172973 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.172981 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.274742 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.274764 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.274773 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.274785 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.274793 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.376362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.376390 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.376400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.376431 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.376438 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.477849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.477887 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.477896 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.477905 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.477912 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.537154 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.537193 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.537216 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:37 crc kubenswrapper[4618]: E0121 09:04:37.537295 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:37 crc kubenswrapper[4618]: E0121 09:04:37.537362 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:37 crc kubenswrapper[4618]: E0121 09:04:37.537418 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.551578 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 11:52:24.787354596 +0000 UTC Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.579595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.579620 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.579629 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.579639 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.579647 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.681390 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.681429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.681442 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.681455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.681466 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.783342 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.783365 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.783372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.783382 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.783390 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.785195 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/3.log" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.785704 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/2.log" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.787433 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" exitCode=1 Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.787459 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.787483 4618 scope.go:117] "RemoveContainer" containerID="6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.787918 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:04:37 crc kubenswrapper[4618]: E0121 09:04:37.788033 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.797671 4618 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.804283 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.811748 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.818620 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc 
kubenswrapper[4618]: I0121 09:04:37.830965 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.839776 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.847382 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.855777 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.864033 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 
09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.871088 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.877833 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.884007 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.885237 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.885265 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.885276 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.885289 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.885298 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.891048 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.901779 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fc037e9e5cb0375b07bc64901a915507aa6431089d71c6972668eb9e278fe65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:08Z\\\",\\\"message\\\":\\\"te:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0121 09:04:08.130756 6320 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}\\\\nI0121 09:04:08.130779 6320 services_controller.go:360] Finished syncing service metrics on namespace openshift-authentication-operator for network=default : 1.305119ms\\\\nF0121 09:04:08.130780 6320 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:08Z is after 2025-08-24T17:21:41Z]\\\\nI0121 09:04:08.130792 6320 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator fo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:37Z\\\",\\\"message\\\":\\\"{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 09:04:37.117656 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0121 09:04:37.117660 
6767 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0121 09:04:37.117633 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0121 09:04:37.117668 6767 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117672 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117674 6767 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0121 09:04:37.117676 6767 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0121 09:04:37.117681 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0121 09:04:37.117682 6767 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\
\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: 
I0121 09:04:37.909196 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.916495 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.924280 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.932950 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:37Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.987092 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.987120 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.987129 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.987167 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:37 crc kubenswrapper[4618]: I0121 09:04:37.987177 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:37Z","lastTransitionTime":"2026-01-21T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.089005 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.089031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.089040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.089072 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.089080 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.190738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.190765 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.190789 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.190800 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.190809 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.292703 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.292736 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.292744 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.292757 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.292765 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.394449 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.394503 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.394514 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.394527 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.394535 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.496380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.496419 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.496428 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.496438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.496447 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.537703 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:38 crc kubenswrapper[4618]: E0121 09:04:38.537812 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.551913 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 10:52:16.324717002 +0000 UTC Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.598410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.598435 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.598443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.598453 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.598461 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.699940 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.699969 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.699979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.699991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.700000 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.790743 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/3.log" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.793194 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:04:38 crc kubenswrapper[4618]: E0121 09:04:38.793315 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.800955 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.800991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.801000 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.801014 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.801023 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.802387 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.810749 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.818169 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.824795 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.831876 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a15
8d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.843745 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:37Z\\\",\\\"message\\\":\\\"{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 09:04:37.117656 6767 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0121 09:04:37.117660 6767 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0121 09:04:37.117633 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0121 09:04:37.117668 6767 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117672 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117674 6767 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0121 09:04:37.117676 6767 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0121 09:04:37.117681 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0121 09:04:37.117682 6767 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.851227 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.859846 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.871677 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1
abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.879691 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.887994 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.894855 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.901853 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.902862 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.902889 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.902898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.902910 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.902918 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:38Z","lastTransitionTime":"2026-01-21T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.908500 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc 
kubenswrapper[4618]: I0121 09:04:38.916752 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849d
b6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.923875 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.931587 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:38 crc kubenswrapper[4618]: I0121 09:04:38.939479 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:38Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.004695 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.004726 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.004735 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.004748 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.004757 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.106463 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.106635 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.106699 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.106754 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.106809 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.208173 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.208196 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.208205 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.208215 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.208222 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.310473 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.310507 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.310519 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.310531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.310540 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.412413 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.412500 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.412564 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.412631 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.412681 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.514927 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.514957 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.514967 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.514976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.514983 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.537547 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.537662 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.537626 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:39 crc kubenswrapper[4618]: E0121 09:04:39.537844 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:39 crc kubenswrapper[4618]: E0121 09:04:39.537927 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:39 crc kubenswrapper[4618]: E0121 09:04:39.538001 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.552419 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 05:56:58.67830791 +0000 UTC Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.616844 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.616892 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.616901 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.616910 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.616917 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.718176 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.718329 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.718394 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.718455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.718509 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.820220 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.820254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.820266 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.820280 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.820290 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.922401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.922438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.922447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.922458 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:39 crc kubenswrapper[4618]: I0121 09:04:39.922465 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:39Z","lastTransitionTime":"2026-01-21T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.024710 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.024741 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.024750 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.024761 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.024770 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.126211 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.126241 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.126249 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.126258 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.126265 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.228475 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.228510 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.228518 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.228530 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.228538 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.330685 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.330714 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.330722 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.330734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.330744 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.431960 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.431991 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.432000 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.432011 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.432020 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.534377 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.534412 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.534420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.534434 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.534442 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.536913 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:40 crc kubenswrapper[4618]: E0121 09:04:40.537003 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.552846 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 15:13:53.593012533 +0000 UTC Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.636421 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.636450 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.636459 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.636475 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.636484 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.738189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.738207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.738217 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.738226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.738232 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.840048 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.840082 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.840090 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.840102 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.840113 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.941814 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.941840 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.941849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.941862 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:40 crc kubenswrapper[4618]: I0121 09:04:40.941870 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:40Z","lastTransitionTime":"2026-01-21T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.043543 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.043573 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.043581 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.043593 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.043601 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.145298 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.145324 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.145332 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.145341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.145349 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.246468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.246497 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.246507 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.246520 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.246530 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.348261 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.348372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.348429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.348486 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.348541 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.450032 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.450085 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.450094 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.450106 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.450114 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.536788 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.536873 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:41 crc kubenswrapper[4618]: E0121 09:04:41.536990 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.537023 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:41 crc kubenswrapper[4618]: E0121 09:04:41.537101 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:41 crc kubenswrapper[4618]: E0121 09:04:41.537269 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.546072 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.551790 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc 
kubenswrapper[4618]: I0121 09:04:41.551822 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.551833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.551845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.551853 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.553950 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:18:12.685813704 +0000 UTC Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.557894 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:37Z\\\",\\\"message\\\":\\\"{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 09:04:37.117656 6767 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0121 09:04:37.117660 6767 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0121 09:04:37.117633 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0121 09:04:37.117668 6767 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117672 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117674 6767 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0121 09:04:37.117676 6767 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0121 09:04:37.117681 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0121 09:04:37.117682 6767 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.565416 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.572535 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.590422 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.607122 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.616296 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.625319 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.632965 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.639335 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc 
kubenswrapper[4618]: I0121 09:04:41.651784 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.653172 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.653198 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.653208 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.653254 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.653276 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.662706 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.670649 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.676814 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.685033 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.693394 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\"
,\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.700313 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.708021 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:41Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.755650 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.755681 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.755690 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 
09:04:41.755704 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.755716 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.857745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.857771 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.857779 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.857789 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.857797 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.959071 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.959095 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.959102 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.959112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:41 crc kubenswrapper[4618]: I0121 09:04:41.959120 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:41Z","lastTransitionTime":"2026-01-21T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.061139 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.061210 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.061220 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.061231 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.061240 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.162424 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.162455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.162465 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.162476 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.162485 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.264304 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.264338 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.264349 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.264361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.264370 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.365970 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.365997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.366007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.366017 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.366024 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.467260 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.467293 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.467303 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.467315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.467325 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.537267 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:42 crc kubenswrapper[4618]: E0121 09:04:42.537400 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.543959 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.554568 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 01:34:54.699319521 +0000 UTC Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.568590 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.568617 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.568633 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.568644 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.568652 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.670452 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.670473 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.670480 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.670489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.670496 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.772076 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.772110 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.772120 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.772131 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.772156 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.874066 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.874094 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.874102 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.874112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.874119 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.975456 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.975513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.975524 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.975535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:42 crc kubenswrapper[4618]: I0121 09:04:42.975542 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:42Z","lastTransitionTime":"2026-01-21T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.063168 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.063197 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.063205 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.063215 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.063222 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.071782 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.073819 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.073839 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.073846 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.073855 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.073864 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.081238 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.083001 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.083022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.083030 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.083040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.083047 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.090284 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.092122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.092154 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.092165 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.092174 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.092182 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.101485 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.103654 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.103678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.103685 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.103696 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.103702 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.110977 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:43Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.111077 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.112036 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.112059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.112068 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.112078 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.112086 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.213939 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.213977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.213988 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.214000 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.214008 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.315928 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.315954 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.315962 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.315972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.315978 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.358397 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.358482 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 09:05:47.358465936 +0000 UTC m=+146.108933263 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.358526 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.358567 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.358623 4618 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.358649 4618 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.358671 4618 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:47.358663429 +0000 UTC m=+146.109130756 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.358685 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:47.358679138 +0000 UTC m=+146.109146455 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.417836 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.417860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.417868 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.417876 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.417882 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459061 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459080 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.458988 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.459162 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459210 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459219 4618 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459225 4618 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459247 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:47.459240374 +0000 UTC m=+146.209707680 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459317 4618 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.459336 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:47.459330874 +0000 UTC m=+146.209798191 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.518845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.518895 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.518905 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.518919 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.518929 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.537114 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.537269 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.537287 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.537287 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.537477 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:43 crc kubenswrapper[4618]: E0121 09:04:43.537778 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.555097 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 07:46:00.163783359 +0000 UTC Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.620889 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.620947 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.620959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.620979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.620992 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.723116 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.723164 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.723174 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.723183 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.723190 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.824910 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.824937 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.824946 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.824956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.824963 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.926736 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.926784 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.926794 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.926809 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:43 crc kubenswrapper[4618]: I0121 09:04:43.926821 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:43Z","lastTransitionTime":"2026-01-21T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.027881 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.027915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.027923 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.027936 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.027945 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.129803 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.129850 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.129860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.129875 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.129886 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.231325 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.231362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.231372 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.231384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.231392 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.332281 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.332311 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.332321 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.332335 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.332345 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.433875 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.433897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.433905 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.433914 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.433922 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.535403 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.535429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.535438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.535448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.535455 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.537627 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:44 crc kubenswrapper[4618]: E0121 09:04:44.537725 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.555225 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:00:23.65662525 +0000 UTC Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.637401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.637427 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.637435 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.637444 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.637453 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.739414 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.739449 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.739458 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.739469 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.739479 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.840987 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.841022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.841031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.841043 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.841052 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.942774 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.942802 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.942812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.942821 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:44 crc kubenswrapper[4618]: I0121 09:04:44.942830 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:44Z","lastTransitionTime":"2026-01-21T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.043967 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.043996 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.044006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.044017 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.044025 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.145731 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.145834 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.145847 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.145909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.145930 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.247801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.247837 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.247846 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.247866 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.247875 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.350019 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.350060 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.350069 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.350083 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.350093 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.451731 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.451765 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.451773 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.451784 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.451793 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.537354 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.537354 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:45 crc kubenswrapper[4618]: E0121 09:04:45.537578 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.537373 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:45 crc kubenswrapper[4618]: E0121 09:04:45.537474 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:45 crc kubenswrapper[4618]: E0121 09:04:45.537638 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.553445 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.553482 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.553493 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.553526 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.553536 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.555759 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 10:16:12.162095547 +0000 UTC Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.655125 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.655171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.655180 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.655194 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.655203 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.757335 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.757697 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.757858 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.758038 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.758204 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.859870 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.859906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.859913 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.859926 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.859936 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.962324 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.962361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.962373 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.962387 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:45 crc kubenswrapper[4618]: I0121 09:04:45.962397 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:45Z","lastTransitionTime":"2026-01-21T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.064309 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.064345 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.064353 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.064367 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.064377 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.166058 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.166088 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.166096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.166112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.166121 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.267946 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.267970 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.267979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.267988 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.267993 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.369216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.369400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.369470 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.369534 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.369591 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.471320 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.471436 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.471502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.471574 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.471626 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.537036 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:46 crc kubenswrapper[4618]: E0121 09:04:46.537300 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.556832 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 18:36:27.993038142 +0000 UTC Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.573060 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.573177 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.573188 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.573198 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.573205 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.674762 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.674875 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.674934 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.674994 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.675058 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.776888 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.776921 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.776931 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.776941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.776950 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.878925 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.878951 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.878959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.878969 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.878977 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.980822 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.980849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.980858 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.980867 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:46 crc kubenswrapper[4618]: I0121 09:04:46.980874 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:46Z","lastTransitionTime":"2026-01-21T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.082312 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.082401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.082457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.082528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.082696 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.184059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.184096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.184104 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.184112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.184124 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.285911 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.285943 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.285954 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.285976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.285984 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.387492 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.387518 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.387526 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.387536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.387543 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.488571 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.488640 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.488652 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.488677 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.488686 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.537206 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.537238 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:47 crc kubenswrapper[4618]: E0121 09:04:47.537341 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.537367 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:47 crc kubenswrapper[4618]: E0121 09:04:47.537474 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:47 crc kubenswrapper[4618]: E0121 09:04:47.537536 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.557399 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 01:00:59.906668044 +0000 UTC Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.591117 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.591171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.591182 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.591196 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.591206 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.693386 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.693410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.693418 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.693430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.693438 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.797205 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.797231 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.797240 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.797250 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.797257 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.898824 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.898852 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.898860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.898869 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:47 crc kubenswrapper[4618]: I0121 09:04:47.898877 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:47Z","lastTransitionTime":"2026-01-21T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.000425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.000463 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.000474 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.000489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.000499 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.102325 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.102353 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.102361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.102369 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.102376 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.203883 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.203911 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.203918 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.203927 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.203934 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.305870 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.305905 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.305914 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.305925 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.305935 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.408008 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.408035 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.408044 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.408054 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.408060 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.510444 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.510561 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.510638 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.510723 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.510788 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.536826 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:48 crc kubenswrapper[4618]: E0121 09:04:48.536949 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.558094 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 14:39:35.981932307 +0000 UTC Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.612482 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.612619 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.612708 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.612772 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.612852 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.714362 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.714393 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.714405 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.714416 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.714425 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.815993 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.816081 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.816090 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.816101 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.816107 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.917733 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.917856 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.917942 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.918058 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:48 crc kubenswrapper[4618]: I0121 09:04:48.918163 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:48Z","lastTransitionTime":"2026-01-21T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.019814 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.019833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.019840 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.019849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.019855 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.121533 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.121589 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.121599 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.121608 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.121615 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.222897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.222956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.222965 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.222974 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.222981 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.324564 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.324618 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.324626 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.324636 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.324643 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.425851 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.425881 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.425892 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.425904 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.425913 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.526882 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.526909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.526917 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.526926 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.526932 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.537811 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.537870 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.538177 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:49 crc kubenswrapper[4618]: E0121 09:04:49.538257 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:49 crc kubenswrapper[4618]: E0121 09:04:49.538363 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:49 crc kubenswrapper[4618]: E0121 09:04:49.538412 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.538539 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:04:49 crc kubenswrapper[4618]: E0121 09:04:49.538706 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.558926 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 05:59:55.655321543 +0000 UTC Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.628471 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.628506 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.628518 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.628531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.628540 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.729974 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.730024 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.730042 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.730053 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.730061 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.831836 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.831885 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.831898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.831916 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.831928 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.933471 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.933493 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.933501 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.933511 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:49 crc kubenswrapper[4618]: I0121 09:04:49.933518 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:49Z","lastTransitionTime":"2026-01-21T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.035514 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.035540 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.035548 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.035558 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.035567 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.136724 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.136784 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.136794 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.136803 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.136810 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.237960 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.237988 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.237995 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.238005 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.238013 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.339424 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.339464 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.339472 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.339481 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.339488 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.440566 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.440590 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.440598 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.440610 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.440618 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.537113 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:50 crc kubenswrapper[4618]: E0121 09:04:50.537239 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.542426 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.542458 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.542468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.542480 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.542490 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.559794 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 10:09:01.597874473 +0000 UTC Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.643509 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.643534 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.643541 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.643550 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.643556 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.745351 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.745373 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.745380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.745389 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.745412 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.846489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.846517 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.846527 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.846707 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.846716 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.948617 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.948730 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.948799 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.948866 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:50 crc kubenswrapper[4618]: I0121 09:04:50.948931 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:50Z","lastTransitionTime":"2026-01-21T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.050555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.050587 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.050595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.050605 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.050613 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.152447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.152482 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.152492 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.152502 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.152509 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.253838 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.253861 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.253869 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.253878 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.253885 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.355523 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.355548 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.355557 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.355566 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.355573 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.457443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.457488 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.457499 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.457516 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.457543 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.537622 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.537644 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:51 crc kubenswrapper[4618]: E0121 09:04:51.537731 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.537750 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:51 crc kubenswrapper[4618]: E0121 09:04:51.537799 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:51 crc kubenswrapper[4618]: E0121 09:04:51.537879 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.545576 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc 
kubenswrapper[4618]: I0121 09:04:51.557916 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.558768 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.558796 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.558804 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.558814 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.558821 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.560459 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 07:32:03.264573377 +0000 UTC Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.565771 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"
name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.573033 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.579704 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.586590 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301cacba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.594476 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 
09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.601973 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.610616 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a8865b-9a1b-4b4f-b438-864ff1beb99a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e1cb6f08fcde9f0cc23af14feb0b5639b956cfe132528c9a2cf70cd4104b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480d85e64344fc46aa5e256a518d41ac98b9c42b510611e8a50e23e85b0bcd25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://480d85e64344fc46aa5e256a518d41ac98b9c42b510611e8a50e23e85b0bcd25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.618536 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.626462 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.638527 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:37Z\\\",\\\"message\\\":\\\"{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 09:04:37.117656 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0121 09:04:37.117660 6767 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0121 09:04:37.117633 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0121 09:04:37.117668 6767 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117672 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117674 6767 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0121 09:04:37.117676 6767 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0121 09:04:37.117681 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0121 09:04:37.117682 6767 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.647538 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.655049 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.660414 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.660455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.660478 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc 
kubenswrapper[4618]: I0121 09:04:51.660489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.660497 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.662240 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.668363 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.676908 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.685208 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://56640c13f9c7e702969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.694678 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:51Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.762109 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.762136 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.762161 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.762171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.762179 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.863667 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.863715 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.863725 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.863738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.863747 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.965022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.965052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.965059 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.965069 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:51 crc kubenswrapper[4618]: I0121 09:04:51.965076 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:51Z","lastTransitionTime":"2026-01-21T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.066552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.066576 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.066584 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.066592 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.066599 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.168046 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.168077 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.168087 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.168098 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.168106 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.269560 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.269587 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.269595 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.269604 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.269611 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.370870 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.370898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.370906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.370915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.370922 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.473306 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.473343 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.473352 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.473365 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.473373 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.537345 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:52 crc kubenswrapper[4618]: E0121 09:04:52.537431 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.560848 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 12:34:52.78143551 +0000 UTC Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.574859 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.574882 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.574890 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.574900 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.574910 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.676487 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.676531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.676540 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.676550 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.676557 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.778103 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.778132 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.778156 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.778166 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.778174 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.879850 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.879882 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.879890 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.879902 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.879912 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.980997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.981027 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.981035 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.981043 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:52 crc kubenswrapper[4618]: I0121 09:04:52.981051 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:52Z","lastTransitionTime":"2026-01-21T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.082529 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.082555 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.082563 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.082572 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.082578 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.184203 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.184384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.184391 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.184401 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.184407 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.285674 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.285699 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.285712 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.285722 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.285728 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.387733 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.387768 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.387780 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.387791 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.387800 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.464767 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.464812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.464825 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.464841 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.464853 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.476186 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.478692 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.478817 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.478900 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.478989 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.479073 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.486935 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.489299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.489323 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.489332 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.489341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.489349 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.496910 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... duplicate status patch elided; identical to the preceding attempt at 09:04:53.486935 ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.498763 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.498853 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.498916 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.498982 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.499046 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.506248 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.508505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.508532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.508540 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.508552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.508559 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.516501 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:04:53Z is after 2025-08-24T17:21:41Z" Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.516602 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.517420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.517448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.517458 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.517476 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.517485 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.537004 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.537022 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.537109 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.537155 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.537236 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:53 crc kubenswrapper[4618]: E0121 09:04:53.537307 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.561839 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:36:33.589045513 +0000 UTC Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.619167 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.619189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.619198 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.619207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.619215 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.720882 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.720941 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.720950 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.720959 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.720966 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.822869 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.822897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.822905 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.822915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.822923 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.924400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.924429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.924437 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.924447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:53 crc kubenswrapper[4618]: I0121 09:04:53.924456 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:53Z","lastTransitionTime":"2026-01-21T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.025417 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.025448 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.025457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.025469 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.025477 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.127408 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.127439 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.127447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.127458 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.127465 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.229094 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.229125 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.229133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.229161 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.229169 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.330692 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.330741 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.330750 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.330759 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.330769 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.432698 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.432734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.432743 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.432755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.432762 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.534634 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.534663 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.534671 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.534681 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.534687 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.536967 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:54 crc kubenswrapper[4618]: E0121 09:04:54.537054 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.562601 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 22:29:21.550668747 +0000 UTC Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.636504 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.636530 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.636539 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.636551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.636560 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.738115 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.738259 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.738319 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.738387 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.738453 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.839954 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.839984 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.839994 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.840004 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.840013 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.941855 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.941883 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.941902 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.941912 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:54 crc kubenswrapper[4618]: I0121 09:04:54.941919 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:54Z","lastTransitionTime":"2026-01-21T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.043835 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.043875 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.043885 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.043901 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.043910 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.146171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.146195 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.146203 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.146214 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.146223 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.247753 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.247783 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.247791 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.247802 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.247808 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.349501 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.349545 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.349553 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.349566 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.349574 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.451585 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.451617 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.451627 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.451639 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.451654 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.537335 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.537387 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:55 crc kubenswrapper[4618]: E0121 09:04:55.537422 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:55 crc kubenswrapper[4618]: E0121 09:04:55.537523 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.537548 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:55 crc kubenswrapper[4618]: E0121 09:04:55.537593 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.552986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.553020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.553030 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.553040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.553048 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.563451 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 04:24:28.70639612 +0000 UTC Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.655622 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.655650 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.655660 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.655671 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.655680 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.756844 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.756876 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.756902 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.756915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.756923 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.859118 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.859166 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.859176 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.859187 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.859195 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.960398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.960429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.960437 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.960451 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:55 crc kubenswrapper[4618]: I0121 09:04:55.960459 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:55Z","lastTransitionTime":"2026-01-21T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.062461 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.062485 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.062493 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.062504 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.062512 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.163924 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.163945 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.163953 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.163979 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.163988 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.265958 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.265989 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.265997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.266006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.266013 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.367692 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.367740 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.367750 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.367762 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.367770 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.469531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.469553 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.469561 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.469571 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.469578 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.537766 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:56 crc kubenswrapper[4618]: E0121 09:04:56.537861 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.564343 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:43:03.408024184 +0000 UTC Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.571336 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.571357 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.571365 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.571375 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.571383 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.672620 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.672648 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.672655 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.672665 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.672673 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.773814 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.773841 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.773851 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.773860 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.773867 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.875382 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.875411 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.875419 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.875428 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.875437 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.976829 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.976859 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.976867 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.976878 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:56 crc kubenswrapper[4618]: I0121 09:04:56.976885 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:56Z","lastTransitionTime":"2026-01-21T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.078704 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.078731 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.078753 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.078765 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.078773 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.180369 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.180412 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.180420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.180431 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.180439 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.282275 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.282305 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.282312 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.282322 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.282330 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.383812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.383839 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.383848 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.383856 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.383863 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.485441 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.485479 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.485488 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.485501 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.485510 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.536792 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.536833 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:57 crc kubenswrapper[4618]: E0121 09:04:57.536901 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.536981 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:57 crc kubenswrapper[4618]: E0121 09:04:57.536996 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:57 crc kubenswrapper[4618]: E0121 09:04:57.537114 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.564687 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:30:05.185361488 +0000 UTC Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.587116 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.587243 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.587306 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.587368 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.587430 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.689413 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.689575 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.689639 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.689694 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.689761 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.791872 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.791909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.791917 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.791930 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.791938 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.894300 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.894331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.894340 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.894351 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.894358 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.996165 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.996204 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.996216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.996231 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:57 crc kubenswrapper[4618]: I0121 09:04:57.996241 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:57Z","lastTransitionTime":"2026-01-21T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.098000 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.098031 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.098039 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.098052 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.098060 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.199935 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.199967 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.199975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.199986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.199993 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.301619 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.301651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.301661 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.301673 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.301681 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.403166 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.403197 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.403204 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.403214 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.403223 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.505515 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.505543 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.505551 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.505561 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.505569 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.536970 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:04:58 crc kubenswrapper[4618]: E0121 09:04:58.537214 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.565523 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:06:48.757996302 +0000 UTC Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.606976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.607003 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.607011 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.607020 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.607029 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.708822 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.708850 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.708857 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.708867 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.708875 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.810240 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.810260 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.810268 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.810276 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.810283 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.911503 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.911529 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.911537 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.911546 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:58 crc kubenswrapper[4618]: I0121 09:04:58.911556 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:58Z","lastTransitionTime":"2026-01-21T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.012734 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.012777 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.012785 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.012793 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.012804 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.114381 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.114431 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.114441 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.114450 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.114457 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.215801 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.215833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.215844 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.215856 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.215865 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.317079 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.317104 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.317112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.317121 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.317128 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.419040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.419078 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.419090 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.419107 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.419118 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.520876 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.520898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.520906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.520914 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.520921 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.537473 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.537504 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:04:59 crc kubenswrapper[4618]: E0121 09:04:59.537556 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.537612 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:04:59 crc kubenswrapper[4618]: E0121 09:04:59.537659 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:04:59 crc kubenswrapper[4618]: E0121 09:04:59.537715 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.566291 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 19:06:06.558251319 +0000 UTC Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.622885 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.622915 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.622924 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.622933 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.622940 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.724549 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.724581 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.724594 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.724605 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.724614 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.826138 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.826207 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.826219 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.826230 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.826238 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.927695 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.927737 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.927744 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.927753 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:04:59 crc kubenswrapper[4618]: I0121 09:04:59.927768 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:04:59Z","lastTransitionTime":"2026-01-21T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.029605 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.029632 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.029640 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.029651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.029659 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.131622 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.131662 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.131672 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.131681 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.131688 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.233298 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.233352 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.233366 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.233384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.233397 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.289792 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:00 crc kubenswrapper[4618]: E0121 09:05:00.289870 4618 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:05:00 crc kubenswrapper[4618]: E0121 09:05:00.289918 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs podName:d164c95c-cb58-47e7-a3a3-7e7bce8b9743 nodeName:}" failed. No retries permitted until 2026-01-21 09:06:04.289907233 +0000 UTC m=+163.040374550 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs") pod "network-metrics-daemon-kpxzc" (UID: "d164c95c-cb58-47e7-a3a3-7e7bce8b9743") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.334942 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.334968 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.334977 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.334986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.334993 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.436633 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.436675 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.436683 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.436692 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.436698 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.537651 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:00 crc kubenswrapper[4618]: E0121 09:05:00.537722 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.538934 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.538962 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.538972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.538984 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.538992 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.566668 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 15:57:33.155599016 +0000 UTC Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.641089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.641114 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.641122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.641131 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.641154 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.742931 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.742957 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.742965 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.742975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.742982 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.844483 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.844508 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.844518 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.844532 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.844540 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.946641 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.946691 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.946701 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.946715 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:00 crc kubenswrapper[4618]: I0121 09:05:00.946726 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:00Z","lastTransitionTime":"2026-01-21T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.048760 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.048800 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.048808 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.048817 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.048826 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.150274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.150299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.150307 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.150317 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.150325 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.251682 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.251730 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.251738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.251747 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.251755 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.353633 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.353670 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.353679 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.353692 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.353701 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.455192 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.455226 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.455235 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.455250 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.455270 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.536965 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:01 crc kubenswrapper[4618]: E0121 09:05:01.537053 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.537063 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.537093 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:01 crc kubenswrapper[4618]: E0121 09:05:01.537127 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:01 crc kubenswrapper[4618]: E0121 09:05:01.537205 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.544619 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-drdgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2972a05a-04de-4c13-8436-38cbbac3a4a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://758c06e5f53920c7d6314469e6f618e63001aa303249dfdf92b57dc04131b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-drdgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.551720 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f819fb41-8eb7-4f8f-85f9-752aa5716cca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e03
ba0a384a480b95d229b2f40a158d624b7a582ebea8fe04e24d98fdd94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sddf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2bm47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.556239 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.556262 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.556272 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.556284 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.556293 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.564023 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"992361e5-8eb9-426d-9eed-afffb0c30615\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:37Z\\\",\\\"message\\\":\\\"{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0121 09:04:37.117656 6767 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0121 09:04:37.117660 6767 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0121 09:04:37.117633 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0121 09:04:37.117668 6767 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117672 6767 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0121 09:04:37.117674 6767 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0121 09:04:37.117676 6767 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0121 09:04:37.117681 6767 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0121 09:04:37.117682 6767 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:04:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e25c1d1f6a6c98a1b1
5f209edd368769b9671eed73c2e65e2e2c597895cfd702\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c58lk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-894tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.567077 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 09:43:56.419747031 +0000 UTC Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.571525 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a869fb76-1885-4284-8958-4979646d3c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c11f639a8a3d3f3f97e13c63e54d4098011e11191266ed0dd96c0d83f249577\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://701701b2bfba9863375b501533d46676cb575ca4a925c7333e24db9c7104171d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da4446949fa5699598715c0fd0b8d74b47d7844d2a8016c1bc0d9a602bcb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.579267 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.586599 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dee096118681e0c0cb4c9e526a371a21559b808491742a24e79047c82ea1d4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.594184 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b92189590ade9e8692a47def90b57ae5a60f2ef4f40fc79666153b7dd20c07f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56640c13f9c7e70
2969b68b677361482e736feb4320584fc791782b39271db88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.603069 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-24dd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32082919-a07c-414d-b784-1ad042460385\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1b7601a4cab25d3164ceafbcc7848be00c8a9df25ac9307a7d924f14ccc473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c171065aa473d03bd8f98973830547b3ef553aa36109a5d43d2eb0a768f9f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53bd3c368effc9a21ff22b4ac9ae5629c25e5c6e8f55d994abbdd02aeea4fd01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c61f8c9a2c5c05d85a72db4c32ae95ff5d431251b00fa8f59479b291dd81afae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7df6
ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7df6ffdaf795a4f24655b5fdd4a4d1aecd3a6298637c7be681a63eab893717c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82afe5341ab2af447f97831d1d465a336232c039443b74923dcc58087c7e30f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9244cc6d4b979a666f205feebcbc52faa387f47226449f6b76bea3ac62522bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbtt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-24dd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.609675 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fbzbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83f21b2-3e45-4972-a51f-75b5a51495dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76884b051d77234b1d2c8e7f8c60ba32d8d451acf7187c0ebb76f92f7fbad650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-21T09:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fbzbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.617225 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2cb204f-4bf6-441d-95e5-b8fb8644948d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e189df688b68aba03fbba61d8f9a4fb0f3fe3d13661dc5124b1da8d9f7c31c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63de47e79d5aacc8b4d0277bbacca06f3301c
acba64dbd6601abf314c3d4e92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cktk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pcpbx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.623368 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqbsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpxzc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc 
kubenswrapper[4618]: I0121 09:05:01.635393 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a3aa0ed-0a23-4291-aa14-9fa3a1d2ba2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a159889fb78df41d300f15a6f677752e85f314e397bf318394310d11e8fe79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://27e677d9e203793d9e40fc7242422c1c0c5ed0c7af5efca1766e8d6ad68ffc3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c4ef73f18a12e510740b523bb9662f13efdceeea0dc49631ebf2721d68c9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418f66d2ccc734bc1abdbb6a4b3be0bcccc0f3666f1d8a849cd0fb9f7d7446c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://957730ecfebc1589548c950ab9651214eac1a5a37415ff5ff710094a5f3cdf86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d9a18e441fab0a4276cadca79a7322111a682376ed3afc218fd281a01e5c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d00a0676c8dd5828fe6ad81c20672797e0027136972d8293b25c410266c6ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed0c499ef2355d8f7af5df44ccaeb6b8032c2b415b84eb2518314d4ba195d674\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.643203 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6c6988197622837f89462181ecd56200cb442479d0c5c748de4bbfd50971dc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.650389 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.657382 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.657482 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.657554 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.657618 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.657687 4618 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.658730 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.666598 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m6jz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052a66c4-94ce-4336-93f6-1d0023e58cc4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-21T09:04:29Z\\\",\\\"message\\\":\\\"2026-01-21T09:03:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817\\\\n2026-01-21T09:03:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d181d51-3fbf-48d9-9db9-4131f708e817 to /host/opt/cni/bin/\\\\n2026-01-21T09:03:44Z [verbose] multus-daemon started\\\\n2026-01-21T09:03:44Z [verbose] 
Readiness Indicator file check\\\\n2026-01-21T09:04:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dps94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m6jz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.674913 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"597ab3f9-d061-44c9-9a42-5abcfa77a11d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T09:03:38Z\\\",\\\"message\\\":\\\"information is complete\\\\nW0121 09:03:38.777283 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 09:03:38.777301 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 09:03:38.777303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 09:03:38.777306 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 09:03:38.779435 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1768986203\\\\\\\\\\\\\\\" (2026-01-21 09:03:22 +0000 UTC to 2026-02-20 09:03:23 +0000 UTC (now=2026-01-21 09:03:38.779414316 +0000 UTC))\\\\\\\"\\\\nI0121 
09:03:38.779533 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1768986213\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1768986213\\\\\\\\\\\\\\\" (2026-01-21 08:03:33 +0000 UTC to 2027-01-21 08:03:33 +0000 UTC (now=2026-01-21 09:03:38.779513663 +0000 UTC))\\\\\\\"\\\\nI0121 09:03:38.779544 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0121 09:03:38.779559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0121 09:03:38.779573 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779592 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0121 09:03:38.779614 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3395039996/tls.crt::/tmp/serving-cert-3395039996/tls.key\\\\\\\"\\\\nF0121 09:03:38.779654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 09:03:38.781946 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.681977 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1779f0a-c4f1-4bd2-80df-d7104929f589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cace745ca92d75e50912f95767f70b3af664de27b2e5b2c9c684f6b24639932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1039e5ae59e77b0697866327bd25dff5a355319d94a6464389a72128cc2268c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e46b99a226d98e05df7685fb5b3b944bdc3182f0a0923c5f12b75b8597fee8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2728ed44d9128c747eb7491c5be1b41f13eb3ae46a2c304831b2cb8b385fb4e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.687964 4618 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a8865b-9a1b-4b4f-b438-864ff1beb99a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T09:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e1cb6f08fcde9f0cc23af14feb0b5639b956cfe132528c9a2cf70cd4104b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480d85e64344fc46aa5e256a518d41ac98b9c42b510611e8a50e23e85b0bcd25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://480d85e64344fc46aa5e256a518d41ac98b9c42b510611e8a50e23e85b0bcd25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T09:03:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T09:03:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T09:03:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:01Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.759635 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.759664 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.759675 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.759687 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.759696 4618 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.860830 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.860848 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.860856 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.860865 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.860873 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.962181 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.962212 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.962220 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.962230 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:01 crc kubenswrapper[4618]: I0121 09:05:01.962238 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:01Z","lastTransitionTime":"2026-01-21T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.065817 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.065857 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.065871 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.065882 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.065890 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.167505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.167528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.167535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.167546 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.167552 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.268745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.268812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.268827 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.268848 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.268857 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.370336 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.370384 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.370394 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.370405 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.370413 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.472402 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.472430 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.472438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.472449 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.472457 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.537200 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:02 crc kubenswrapper[4618]: E0121 09:05:02.537402 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.568107 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 23:18:37.322205931 +0000 UTC Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.574672 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.574713 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.574722 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.574738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.574748 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.676539 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.676570 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.676578 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.676589 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.676598 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.778528 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.778556 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.778564 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.778574 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.778582 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.880113 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.880175 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.880185 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.880198 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.880210 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.982380 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.982411 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.982420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.982431 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:02 crc kubenswrapper[4618]: I0121 09:05:02.982456 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:02Z","lastTransitionTime":"2026-01-21T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.084594 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.084629 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.084640 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.084651 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.084659 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.186453 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.186497 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.186508 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.186519 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.186527 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.287730 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.287764 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.287800 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.287813 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.287821 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.389246 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.389285 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.389293 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.389305 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.389315 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.491054 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.491087 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.491096 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.491108 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.491117 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.537417 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.537490 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.537533 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.537589 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.537641 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.537710 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.568426 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:59:47.877286588 +0000 UTC Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.592779 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.592823 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.592833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.592845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.592856 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.694659 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.694678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.694685 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.694693 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.694702 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.695406 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.695434 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.695442 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.695455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.695466 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.704013 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:03Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.706710 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.706738 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.706747 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.706758 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.706766 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.714650 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:03Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.716744 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.716767 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.716775 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.716794 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.716801 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.724336 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:03Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.726431 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.726456 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.726464 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.726472 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.726479 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.733757 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:03Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.735648 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.735670 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.735678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.735687 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.735694 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.743302 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-21T09:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ba5df0e4-fb86-421b-948d-88ebb5510825\\\",\\\"systemUUID\\\":\\\"0573d94e-a416-4c81-b057-07a8619fdfca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T09:05:03Z is after 2025-08-24T17:21:41Z" Jan 21 09:05:03 crc kubenswrapper[4618]: E0121 09:05:03.743400 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.796087 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.796127 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.796136 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.796181 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.796191 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.897760 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.897818 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.897829 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.897840 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.897847 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.999098 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.999161 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.999170 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.999181 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:03 crc kubenswrapper[4618]: I0121 09:05:03.999188 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:03Z","lastTransitionTime":"2026-01-21T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.101536 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.101769 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.101778 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.101799 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.101823 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.203224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.203253 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.203263 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.203274 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.203282 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.304972 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.304995 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.305003 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.305012 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.305019 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.407024 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.407047 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.407055 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.407064 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.407071 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.508095 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.508125 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.508166 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.508175 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.508182 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.537815 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:04 crc kubenswrapper[4618]: E0121 09:05:04.538492 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.538759 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:05:04 crc kubenswrapper[4618]: E0121 09:05:04.538942 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.569377 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:36:29.853768424 +0000 UTC Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.610248 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.610291 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.610299 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.610312 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.610320 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.711748 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.711788 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.711804 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.711842 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.711852 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.813576 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.813597 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.813604 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.813613 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.813619 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.915628 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.915654 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.915663 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.915673 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:04 crc kubenswrapper[4618]: I0121 09:05:04.915681 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:04Z","lastTransitionTime":"2026-01-21T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.017656 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.017682 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.017691 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.017701 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.017711 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.119443 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.119469 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.119479 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.119489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.119498 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.221410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.221428 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.221436 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.221444 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.221451 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.323068 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.323102 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.323110 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.323122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.323130 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.424745 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.424772 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.424779 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.424791 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.424809 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.526083 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.526104 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.526112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.526121 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.526128 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.537611 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.537651 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.537612 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:05 crc kubenswrapper[4618]: E0121 09:05:05.537686 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:05 crc kubenswrapper[4618]: E0121 09:05:05.537732 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:05 crc kubenswrapper[4618]: E0121 09:05:05.537813 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.569743 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 17:14:20.884275223 +0000 UTC Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.628101 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.628216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.628280 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.628364 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.628427 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.729678 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.729714 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.729723 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.729735 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.729743 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.831927 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.831965 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.831973 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.831987 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.831996 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.934048 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.934076 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.934086 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.934097 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:05 crc kubenswrapper[4618]: I0121 09:05:05.934105 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:05Z","lastTransitionTime":"2026-01-21T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.035304 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.035425 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.035521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.035602 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.035673 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.136884 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.137008 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.137109 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.137206 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.137280 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.238341 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.238382 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.238390 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.238398 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.238405 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.339849 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.339875 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.339884 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.339894 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.339901 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.441672 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.441697 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.441705 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.441714 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.441721 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.537539 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:06 crc kubenswrapper[4618]: E0121 09:05:06.537784 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.543045 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.543070 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.543079 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.543089 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.543096 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.570626 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 05:47:01.665638159 +0000 UTC Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.645122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.645164 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.645173 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.645182 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.645189 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.746457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.746504 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.746512 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.746525 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.746537 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.847752 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.847781 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.847790 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.847799 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.847815 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.949438 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.949467 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.949475 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.949488 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:06 crc kubenswrapper[4618]: I0121 09:05:06.949503 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:06Z","lastTransitionTime":"2026-01-21T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.051567 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.051599 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.051609 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.051620 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.051647 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.153468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.153512 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.153521 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.153531 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.153538 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.255552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.255576 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.255583 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.255592 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.255599 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.356908 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.356929 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.356937 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.356945 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.356952 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.458003 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.458048 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.458056 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.458066 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.458073 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.536710 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.536728 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:07 crc kubenswrapper[4618]: E0121 09:05:07.536785 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.536853 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:07 crc kubenswrapper[4618]: E0121 09:05:07.536966 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:07 crc kubenswrapper[4618]: E0121 09:05:07.537027 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.559420 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.559447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.559457 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.559468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.559476 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.570685 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 11:47:43.176772316 +0000 UTC Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.660331 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.660383 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.660392 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.660402 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.660409 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.762084 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.762113 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.762121 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.762133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.762161 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.863983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.864004 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.864012 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.864022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.864028 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.965785 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.965806 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.965814 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.965833 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:07 crc kubenswrapper[4618]: I0121 09:05:07.965841 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:07Z","lastTransitionTime":"2026-01-21T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.067554 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.067574 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.067582 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.067590 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.067596 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.169324 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.169345 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.169353 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.169361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.169367 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.270657 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.270690 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.270700 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.270710 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.270719 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.372187 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.372215 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.372224 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.372236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.372245 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.474484 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.474505 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.474513 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.474522 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.474528 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.536724 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:08 crc kubenswrapper[4618]: E0121 09:05:08.536834 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.571211 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 07:38:12.992890689 +0000 UTC Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.576189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.576213 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.576220 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.576229 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.576236 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.677907 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.677946 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.677961 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.677975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.677984 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.779622 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.779647 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.779654 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.779662 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.779669 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.880700 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.880730 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.880741 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.880753 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.880760 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.982854 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.982880 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.982888 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.982897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:08 crc kubenswrapper[4618]: I0121 09:05:08.982904 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:08Z","lastTransitionTime":"2026-01-21T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.084980 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.085005 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.085013 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.085022 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.085030 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.186589 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.186614 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.186621 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.186631 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.186639 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.288006 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.288103 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.288189 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.288256 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.288323 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.389865 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.389890 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.389898 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.389906 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.389912 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.491741 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.491885 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.491975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.492081 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.492156 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.536861 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.536912 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:09 crc kubenswrapper[4618]: E0121 09:05:09.536976 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.536871 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:09 crc kubenswrapper[4618]: E0121 09:05:09.537109 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:09 crc kubenswrapper[4618]: E0121 09:05:09.537192 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.572005 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 22:44:05.716738971 +0000 UTC Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.594124 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.594178 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.594187 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.594199 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.594207 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.695587 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.695621 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.695631 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.695645 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.695654 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.796897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.797040 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.797112 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.797212 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.797297 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.899447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.899489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.899499 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.899517 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:09 crc kubenswrapper[4618]: I0121 09:05:09.899526 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:09Z","lastTransitionTime":"2026-01-21T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.001039 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.001152 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.001225 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.001292 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.001353 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.103400 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.103421 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.103429 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.103439 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.103446 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.204845 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.204871 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.204878 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.204889 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.204897 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.306572 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.306599 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.306606 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.306615 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.306642 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.408293 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.408383 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.408397 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.408408 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.408418 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.509955 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.509986 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.509996 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.510007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.510015 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.537340 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:10 crc kubenswrapper[4618]: E0121 09:05:10.537454 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.573272 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:35:42.938988084 +0000 UTC Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.611610 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.611634 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.611643 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.611653 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.611660 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.713194 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.713255 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.713266 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.713280 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.713291 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.814896 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.814925 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.814933 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.814942 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.814950 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.916422 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.916447 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.916455 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.916465 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:10 crc kubenswrapper[4618]: I0121 09:05:10.916472 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:10Z","lastTransitionTime":"2026-01-21T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.018489 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.018515 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.018522 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.018533 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.018542 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.119922 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.119975 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.119983 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.119996 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.120005 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.222091 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.222122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.222130 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.222159 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.222169 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.323378 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.323410 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.323462 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.323478 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.323489 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.424805 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.424832 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.424850 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.424861 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.424868 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.526191 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.526222 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.526232 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.526243 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.526250 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.537435 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.537474 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:11 crc kubenswrapper[4618]: E0121 09:05:11.537530 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.537619 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:11 crc kubenswrapper[4618]: E0121 09:05:11.537757 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:11 crc kubenswrapper[4618]: E0121 09:05:11.537861 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.559832 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pcpbx" podStartSLOduration=88.559818226 podStartE2EDuration="1m28.559818226s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.552491629 +0000 UTC m=+110.302958946" watchObservedRunningTime="2026-01-21 09:05:11.559818226 +0000 UTC m=+110.310285544" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.573546 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 11:57:05.877074746 +0000 UTC Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.576251 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=89.576240623 podStartE2EDuration="1m29.576240623s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.575497991 +0000 UTC m=+110.325965308" watchObservedRunningTime="2026-01-21 09:05:11.576240623 +0000 UTC m=+110.326707950" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.614640 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fbzbk" podStartSLOduration=89.614621134 podStartE2EDuration="1m29.614621134s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.604301297 +0000 UTC 
m=+110.354768614" watchObservedRunningTime="2026-01-21 09:05:11.614621134 +0000 UTC m=+110.365088451" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.615014 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-m6jz5" podStartSLOduration=89.615007804 podStartE2EDuration="1m29.615007804s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.614561292 +0000 UTC m=+110.365028609" watchObservedRunningTime="2026-01-21 09:05:11.615007804 +0000 UTC m=+110.365475121" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.624641 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=92.624624024 podStartE2EDuration="1m32.624624024s" podCreationTimestamp="2026-01-21 09:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.624394972 +0000 UTC m=+110.374862289" watchObservedRunningTime="2026-01-21 09:05:11.624624024 +0000 UTC m=+110.375091341" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.627596 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.627621 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.627630 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.627641 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.627649 4618 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.639703 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=64.639688939 podStartE2EDuration="1m4.639688939s" podCreationTimestamp="2026-01-21 09:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.633497403 +0000 UTC m=+110.383964720" watchObservedRunningTime="2026-01-21 09:05:11.639688939 +0000 UTC m=+110.390156256" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.639779 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.639775453 podStartE2EDuration="29.639775453s" podCreationTimestamp="2026-01-21 09:04:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.63967914 +0000 UTC m=+110.390146457" watchObservedRunningTime="2026-01-21 09:05:11.639775453 +0000 UTC m=+110.390242770" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.670136 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podStartSLOduration=89.670119976 podStartE2EDuration="1m29.670119976s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.655490634 +0000 UTC m=+110.405957951" watchObservedRunningTime="2026-01-21 09:05:11.670119976 +0000 UTC m=+110.420587283" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.679041 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.679028911 podStartE2EDuration="1m29.679028911s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.678962385 +0000 UTC m=+110.429429702" watchObservedRunningTime="2026-01-21 09:05:11.679028911 +0000 UTC m=+110.429496228" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.700448 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-drdgl" podStartSLOduration=89.700437849 podStartE2EDuration="1m29.700437849s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.700132804 +0000 UTC m=+110.450600131" watchObservedRunningTime="2026-01-21 09:05:11.700437849 +0000 UTC m=+110.450905166" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.719168 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-24dd7" podStartSLOduration=89.719157031 podStartE2EDuration="1m29.719157031s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:11.718449435 +0000 UTC m=+110.468916753" watchObservedRunningTime="2026-01-21 09:05:11.719157031 +0000 UTC m=+110.469624348" Jan 21 09:05:11 crc 
kubenswrapper[4618]: I0121 09:05:11.729962 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.729993 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.730001 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.730011 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.730021 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.831787 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.831816 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.831824 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.831835 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.831855 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.933751 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.933803 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.933812 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.933826 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:11 crc kubenswrapper[4618]: I0121 09:05:11.933834 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:11Z","lastTransitionTime":"2026-01-21T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.035197 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.035227 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.035236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.035247 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.035256 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.136570 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.136608 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.136616 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.136629 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.136637 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.238481 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.238515 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.238524 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.238535 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.238543 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.340176 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.340214 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.340223 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.340236 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.340281 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.441976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.442007 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.442015 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.442026 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.442036 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.537527 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:12 crc kubenswrapper[4618]: E0121 09:05:12.537610 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.543707 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.543755 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.543766 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.543779 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.543789 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.573814 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 01:50:01.484574128 +0000 UTC Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.645413 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.645468 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.645476 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.645495 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.645504 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.747543 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.747569 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.747578 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.747588 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.747597 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.849552 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.849581 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.849591 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.849603 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.849611 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.951171 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.951206 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.951216 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.951228 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:12 crc kubenswrapper[4618]: I0121 09:05:12.951238 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:12Z","lastTransitionTime":"2026-01-21T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.053315 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.053352 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.053361 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.053373 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.053383 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.154820 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.154851 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.154868 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.154880 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.154887 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.256074 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.256099 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.256122 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.256133 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.256166 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.358046 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.358073 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.358082 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.358092 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.358101 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.460119 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.460177 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.460187 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.460199 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.460207 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.537762 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.537802 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:13 crc kubenswrapper[4618]: E0121 09:05:13.537880 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.537888 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:13 crc kubenswrapper[4618]: E0121 09:05:13.537950 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:13 crc kubenswrapper[4618]: E0121 09:05:13.538088 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.561533 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.561562 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.561570 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.561581 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.561590 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.574041 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:18:07.356543627 +0000 UTC Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.662846 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.662888 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.662897 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.662909 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.662917 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.763997 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.764028 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.764039 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.764051 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.764063 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.865038 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.865072 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.865080 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.865093 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.865100 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.967285 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.967317 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.967325 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.967337 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:13 crc kubenswrapper[4618]: I0121 09:05:13.967346 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:13Z","lastTransitionTime":"2026-01-21T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.032923 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.032956 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.032964 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.032976 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.032986 4618 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T09:05:14Z","lastTransitionTime":"2026-01-21T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.058848 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9"] Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.059162 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.060514 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.060770 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.061994 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.062193 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.198958 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c44881e5-dfde-4220-b9ed-47d6d62f6121-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.198987 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c44881e5-dfde-4220-b9ed-47d6d62f6121-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.199011 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c44881e5-dfde-4220-b9ed-47d6d62f6121-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.199044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c44881e5-dfde-4220-b9ed-47d6d62f6121-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.199059 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c44881e5-dfde-4220-b9ed-47d6d62f6121-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300203 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c44881e5-dfde-4220-b9ed-47d6d62f6121-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300228 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c44881e5-dfde-4220-b9ed-47d6d62f6121-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300247 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c44881e5-dfde-4220-b9ed-47d6d62f6121-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300273 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c44881e5-dfde-4220-b9ed-47d6d62f6121-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300285 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c44881e5-dfde-4220-b9ed-47d6d62f6121-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300627 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c44881e5-dfde-4220-b9ed-47d6d62f6121-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300655 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/c44881e5-dfde-4220-b9ed-47d6d62f6121-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.300935 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c44881e5-dfde-4220-b9ed-47d6d62f6121-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.304281 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c44881e5-dfde-4220-b9ed-47d6d62f6121-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.312371 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c44881e5-dfde-4220-b9ed-47d6d62f6121-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hsqx9\" (UID: \"c44881e5-dfde-4220-b9ed-47d6d62f6121\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.369301 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.537701 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:14 crc kubenswrapper[4618]: E0121 09:05:14.537928 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.575135 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 00:17:26.013619955 +0000 UTC Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.575200 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.579668 4618 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.865744 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" event={"ID":"c44881e5-dfde-4220-b9ed-47d6d62f6121","Type":"ContainerStarted","Data":"0ad907560d686d3a06f6b81c23700b611b99e45ed2a35b1891434b1b7d2d8d0a"} Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.865830 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" event={"ID":"c44881e5-dfde-4220-b9ed-47d6d62f6121","Type":"ContainerStarted","Data":"bd80a38ff2a87765e207414eb43f8a23674f52a46b2c938d6f1d30aa2d9ae623"} Jan 21 09:05:14 crc kubenswrapper[4618]: I0121 09:05:14.874994 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hsqx9" podStartSLOduration=92.874981348 podStartE2EDuration="1m32.874981348s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:14.874402455 +0000 UTC m=+113.624869773" watchObservedRunningTime="2026-01-21 09:05:14.874981348 +0000 UTC m=+113.625448665" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.537202 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.537230 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.537200 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:15 crc kubenswrapper[4618]: E0121 09:05:15.537565 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.537649 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:05:15 crc kubenswrapper[4618]: E0121 09:05:15.537671 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:15 crc kubenswrapper[4618]: E0121 09:05:15.537736 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:15 crc kubenswrapper[4618]: E0121 09:05:15.537765 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-894tg_openshift-ovn-kubernetes(992361e5-8eb9-426d-9eed-afffb0c30615)\"" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.869373 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/1.log" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.870084 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/0.log" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.870126 4618 generic.go:334] "Generic (PLEG): container finished" podID="052a66c4-94ce-4336-93f6-1d0023e58cc4" containerID="0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0" exitCode=1 Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.870168 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerDied","Data":"0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0"} Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.870209 4618 scope.go:117] "RemoveContainer" containerID="79112fe0a400cd9cda00ed388965be53ba221747388a21692d7f32d486f635ba" Jan 21 09:05:15 crc kubenswrapper[4618]: I0121 09:05:15.870447 4618 scope.go:117] "RemoveContainer" containerID="0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0" Jan 21 09:05:15 crc kubenswrapper[4618]: E0121 09:05:15.870641 4618 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-m6jz5_openshift-multus(052a66c4-94ce-4336-93f6-1d0023e58cc4)\"" pod="openshift-multus/multus-m6jz5" podUID="052a66c4-94ce-4336-93f6-1d0023e58cc4" Jan 21 09:05:16 crc kubenswrapper[4618]: I0121 09:05:16.536860 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:16 crc kubenswrapper[4618]: E0121 09:05:16.536963 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:16 crc kubenswrapper[4618]: I0121 09:05:16.872872 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/1.log" Jan 21 09:05:17 crc kubenswrapper[4618]: I0121 09:05:17.537559 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:17 crc kubenswrapper[4618]: I0121 09:05:17.537613 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:17 crc kubenswrapper[4618]: E0121 09:05:17.537651 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:17 crc kubenswrapper[4618]: I0121 09:05:17.537672 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:17 crc kubenswrapper[4618]: E0121 09:05:17.537750 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:17 crc kubenswrapper[4618]: E0121 09:05:17.537804 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:18 crc kubenswrapper[4618]: I0121 09:05:18.536940 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:18 crc kubenswrapper[4618]: E0121 09:05:18.537030 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:19 crc kubenswrapper[4618]: I0121 09:05:19.537539 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:19 crc kubenswrapper[4618]: I0121 09:05:19.537574 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:19 crc kubenswrapper[4618]: E0121 09:05:19.537650 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:19 crc kubenswrapper[4618]: I0121 09:05:19.537550 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:19 crc kubenswrapper[4618]: E0121 09:05:19.537726 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:19 crc kubenswrapper[4618]: E0121 09:05:19.537856 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:20 crc kubenswrapper[4618]: I0121 09:05:20.536884 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:20 crc kubenswrapper[4618]: E0121 09:05:20.536987 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:21 crc kubenswrapper[4618]: I0121 09:05:21.537311 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:21 crc kubenswrapper[4618]: I0121 09:05:21.537344 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:21 crc kubenswrapper[4618]: E0121 09:05:21.538450 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:21 crc kubenswrapper[4618]: I0121 09:05:21.538479 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:21 crc kubenswrapper[4618]: E0121 09:05:21.538539 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:21 crc kubenswrapper[4618]: E0121 09:05:21.538641 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:21 crc kubenswrapper[4618]: E0121 09:05:21.605792 4618 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 21 09:05:22 crc kubenswrapper[4618]: I0121 09:05:22.537785 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:22 crc kubenswrapper[4618]: E0121 09:05:22.537879 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:23 crc kubenswrapper[4618]: I0121 09:05:23.537637 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:23 crc kubenswrapper[4618]: I0121 09:05:23.537686 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:23 crc kubenswrapper[4618]: E0121 09:05:23.537754 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:23 crc kubenswrapper[4618]: I0121 09:05:23.537762 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:23 crc kubenswrapper[4618]: E0121 09:05:23.537842 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:23 crc kubenswrapper[4618]: E0121 09:05:23.537916 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:24 crc kubenswrapper[4618]: I0121 09:05:24.537225 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:24 crc kubenswrapper[4618]: E0121 09:05:24.537332 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:25 crc kubenswrapper[4618]: I0121 09:05:25.537095 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:25 crc kubenswrapper[4618]: E0121 09:05:25.537203 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:25 crc kubenswrapper[4618]: I0121 09:05:25.537239 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:25 crc kubenswrapper[4618]: I0121 09:05:25.537267 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:25 crc kubenswrapper[4618]: E0121 09:05:25.537369 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:25 crc kubenswrapper[4618]: E0121 09:05:25.537276 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.537274 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:26 crc kubenswrapper[4618]: E0121 09:05:26.537377 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.537826 4618 scope.go:117] "RemoveContainer" containerID="0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0" Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.537862 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:05:26 crc kubenswrapper[4618]: E0121 09:05:26.606587 4618 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.895549 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/1.log" Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.895627 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerStarted","Data":"5b719c70b7e55c9d84d4fc736a13cd679032cac98e4d6fb99ff06c96e560e36a"} Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.897517 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/3.log" Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.902757 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerStarted","Data":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.903054 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:05:26 crc kubenswrapper[4618]: I0121 09:05:26.931564 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podStartSLOduration=104.931550263 podStartE2EDuration="1m44.931550263s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:26.931541987 +0000 UTC m=+125.682009304" watchObservedRunningTime="2026-01-21 09:05:26.931550263 +0000 UTC m=+125.682017580" Jan 21 09:05:27 crc kubenswrapper[4618]: I0121 09:05:27.149376 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-kpxzc"] Jan 21 09:05:27 crc kubenswrapper[4618]: I0121 09:05:27.149470 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:27 crc kubenswrapper[4618]: E0121 09:05:27.149541 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:27 crc kubenswrapper[4618]: I0121 09:05:27.537492 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:27 crc kubenswrapper[4618]: I0121 09:05:27.537526 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:27 crc kubenswrapper[4618]: I0121 09:05:27.537558 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:27 crc kubenswrapper[4618]: E0121 09:05:27.537606 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:27 crc kubenswrapper[4618]: E0121 09:05:27.537674 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:27 crc kubenswrapper[4618]: E0121 09:05:27.537777 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:28 crc kubenswrapper[4618]: I0121 09:05:28.537364 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:28 crc kubenswrapper[4618]: E0121 09:05:28.537472 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:29 crc kubenswrapper[4618]: I0121 09:05:29.537117 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:29 crc kubenswrapper[4618]: I0121 09:05:29.537126 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:29 crc kubenswrapper[4618]: E0121 09:05:29.537249 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:29 crc kubenswrapper[4618]: I0121 09:05:29.537348 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:29 crc kubenswrapper[4618]: E0121 09:05:29.537415 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:29 crc kubenswrapper[4618]: E0121 09:05:29.537467 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:30 crc kubenswrapper[4618]: I0121 09:05:30.537666 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:30 crc kubenswrapper[4618]: E0121 09:05:30.537774 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpxzc" podUID="d164c95c-cb58-47e7-a3a3-7e7bce8b9743" Jan 21 09:05:31 crc kubenswrapper[4618]: I0121 09:05:31.537678 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:31 crc kubenswrapper[4618]: I0121 09:05:31.537711 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:31 crc kubenswrapper[4618]: E0121 09:05:31.537858 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 09:05:31 crc kubenswrapper[4618]: I0121 09:05:31.538068 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:31 crc kubenswrapper[4618]: E0121 09:05:31.538175 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 09:05:31 crc kubenswrapper[4618]: E0121 09:05:31.538256 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 09:05:32 crc kubenswrapper[4618]: I0121 09:05:32.536768 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:05:32 crc kubenswrapper[4618]: I0121 09:05:32.538508 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 09:05:32 crc kubenswrapper[4618]: I0121 09:05:32.538924 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.537326 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.537378 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.537500 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.539036 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.539819 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.539841 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 09:05:33 crc kubenswrapper[4618]: I0121 09:05:33.540176 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.698311 4618 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.718156 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cj7pg"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.718558 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.721173 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.721962 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.722689 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.723194 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.723485 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ck8kg"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.726057 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727270 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727440 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727623 4618 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727636 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs"] Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.727654 4618 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727645 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727948 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727624 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.728006 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727678 4618 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728361 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list 
*v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727687 4618 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728033 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728410 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727689 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727700 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"openshift-service-ca.crt": failed to list 
*v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728480 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727706 4618 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728504 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727723 4618 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace 
"openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728532 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727756 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727781 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.727790 4618 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728627 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.727857 4618 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.728288 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728685 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.728294 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.728707 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.728317 4618 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.728815 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-b7sq5"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.729127 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.729553 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kt5l4"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.729961 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.730254 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.730607 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.730661 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.730922 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.734211 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.734240 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.736121 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.736165 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.736126 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to 
list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.736180 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.736190 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.736269 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.736290 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: W0121 09:05:34.736297 4618 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list 
resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Jan 21 09:05:34 crc kubenswrapper[4618]: E0121 09:05:34.736314 4618 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.736509 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.736550 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.737249 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2qf46"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.737627 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.737827 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.738282 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.738330 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.738405 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.738562 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.738758 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.738770 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.739110 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.739411 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.739783 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 09:05:34 crc 
kubenswrapper[4618]: I0121 09:05:34.740722 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.740884 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.740927 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.740944 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.741094 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.741095 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.741570 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.741732 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.741846 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.741941 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.742454 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.742755 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.743039 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.744068 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4jk5f"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.744342 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.744562 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.745264 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fnvx2"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.745600 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.745707 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dd2fv"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.745965 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.746504 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.746537 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.746621 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.746624 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.746678 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.746891 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.747738 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw6g"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.748055 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.749310 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.749453 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.749564 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.749684 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.749777 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.749922 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.750058 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.750239 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.750329 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.750417 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 
09:05:34.753628 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.754847 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.755079 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756339 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756686 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756742 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756760 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756812 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756909 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.756976 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.757068 4618 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.757328 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.757469 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.757625 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.757631 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.758042 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.758410 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.758775 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.759094 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.764118 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.764428 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: 
I0121 09:05:34.765039 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.765100 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.765175 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.765198 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.768640 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.768883 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.772087 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bkf4x"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.772472 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.775225 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzz2k"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.775688 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.777640 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.780687 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.782088 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.782625 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.782877 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.782924 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.782997 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.783508 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.784314 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.784803 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.785098 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.785335 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.785485 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.785706 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-prprb"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.785738 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.785778 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.786071 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.786268 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.788010 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.789346 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.789559 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.792194 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.799042 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.801188 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.801514 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.802694 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.802875 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.804270 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.806731 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.806808 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-sknqd"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.808131 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cj7pg"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.809677 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.810248 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.810528 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.811119 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.814609 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.815801 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.816624 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.817431 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.818108 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.819049 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.821159 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dp9f8"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.821881 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.822308 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.822335 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.822382 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-55flm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.822752 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.823630 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.824768 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-888z8"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.825343 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.825756 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.825917 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.826056 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.826357 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.828767 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ck8kg"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.829919 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.831027 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.832489 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.832609 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-b7sq5"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.833851 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.835243 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dd2fv"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.836133 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzz2k"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.837246 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mf758"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.838013 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.838341 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.839304 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.840606 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.840641 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fnvx2"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.841122 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.841924 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.842711 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.846077 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.846178 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kt5l4"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.848339 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-4jk5f"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.852589 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw6g"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.852611 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.852999 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853035 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-oauth-config\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853052 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-auth-proxy-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853068 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7gs\" (UniqueName: 
\"kubernetes.io/projected/47473970-6704-4bb3-83fb-eee0a9db5552-kube-api-access-9f7gs\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853083 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-config\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853096 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61398e4c-d6b9-4a84-9d33-b53e45349442-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853111 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-config\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853124 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61398e4c-d6b9-4a84-9d33-b53e45349442-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853158 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853174 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853190 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22gzd\" (UniqueName: \"kubernetes.io/projected/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-kube-api-access-22gzd\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853205 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed732303-6d90-47f7-ada6-c88b84249ddb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853220 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7g8\" (UniqueName: \"kubernetes.io/projected/676dae90-1358-4195-a506-0d4bc4b651db-kube-api-access-kg7g8\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853232 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2a17f8b-948b-41dc-af06-6c98af1134fe-serving-cert\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853247 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed732303-6d90-47f7-ada6-c88b84249ddb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853261 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwdv\" (UniqueName: \"kubernetes.io/projected/212ffb51-33a2-4282-afed-de31b0da9d84-kube-api-access-bmwdv\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853277 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853292 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgrw\" (UniqueName: \"kubernetes.io/projected/b2a17f8b-948b-41dc-af06-6c98af1134fe-kube-api-access-hhgrw\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853305 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4159e19-631f-40cf-b53f-fb42d9171a06-machine-approver-tls\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853319 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853332 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: 
\"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853348 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b52b45bc-5ace-4daa-8548-030f576ece0f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853362 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c57370f-f724-43f3-91c9-03a98c087966-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5dd7c\" (UID: \"6c57370f-f724-43f3-91c9-03a98c087966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853389 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853403 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a17f8b-948b-41dc-af06-6c98af1134fe-config\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853417 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676dae90-1358-4195-a506-0d4bc4b651db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853430 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b52b45bc-5ace-4daa-8548-030f576ece0f-config\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853445 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853460 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853474 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-trusted-ca-bundle\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853487 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntxm\" (UniqueName: \"kubernetes.io/projected/61398e4c-d6b9-4a84-9d33-b53e45349442-kube-api-access-8ntxm\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853502 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-service-ca-bundle\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853515 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-policies\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853538 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-client-ca\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853552 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-serving-cert\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853566 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76w9l\" (UniqueName: \"kubernetes.io/projected/ed732303-6d90-47f7-ada6-c88b84249ddb-kube-api-access-76w9l\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853584 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47473970-6704-4bb3-83fb-eee0a9db5552-serving-cert\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853596 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676dae90-1358-4195-a506-0d4bc4b651db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853610 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-config\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853623 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncr2d\" (UniqueName: \"kubernetes.io/projected/f4159e19-631f-40cf-b53f-fb42d9171a06-kube-api-access-ncr2d\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853635 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9lk2\" (UniqueName: \"kubernetes.io/projected/57e2a16d-ea83-4e99-844e-089ccba97f47-kube-api-access-j9lk2\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853650 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-serving-cert\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853664 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlsn8\" (UniqueName: \"kubernetes.io/projected/8da3ae7d-2af2-436f-85e8-542ae6eab03b-kube-api-access-wlsn8\") pod \"console-f9d7485db-dd2fv\" (UID: 
\"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853677 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853692 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853705 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh8bp\" (UniqueName: \"kubernetes.io/projected/6c57370f-f724-43f3-91c9-03a98c087966-kube-api-access-qh8bp\") pod \"cluster-samples-operator-665b6dd947-5dd7c\" (UID: \"6c57370f-f724-43f3-91c9-03a98c087966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853718 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-service-ca\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853733 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212ffb51-33a2-4282-afed-de31b0da9d84-serving-cert\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853747 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4r5z\" (UniqueName: \"kubernetes.io/projected/b3d29e61-443b-4040-b68b-4ad190ea08be-kube-api-access-j4r5z\") pod \"downloads-7954f5f757-4jk5f\" (UID: \"b3d29e61-443b-4040-b68b-4ad190ea08be\") " pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853762 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-config\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853776 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-client-ca\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853789 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a17f8b-948b-41dc-af06-6c98af1134fe-trusted-ca\") pod \"console-operator-58897d9998-2zw6g\" (UID: 
\"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853801 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-oauth-serving-cert\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853813 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853827 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853841 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853854 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b52b45bc-5ace-4daa-8548-030f576ece0f-images\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853867 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcmqf\" (UniqueName: \"kubernetes.io/projected/b52b45bc-5ace-4daa-8548-030f576ece0f-kube-api-access-pcmqf\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853880 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/61398e4c-d6b9-4a84-9d33-b53e45349442-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.853893 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-dir\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.854366 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bkf4x"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.854381 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2qf46"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.855899 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.856714 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.857565 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.858823 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-888z8"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.859704 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.860013 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.860890 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.862444 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.864368 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-prprb"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.865375 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dp9f8"] Jan 21 
09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.866967 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mf758"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.867784 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.868967 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.874634 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.876816 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-55flm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.878316 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.879255 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-542mv"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.879677 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.879894 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.880255 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dcdxm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.880813 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.881245 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-542mv"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.882049 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dcdxm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.899650 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.939763 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.944517 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cmnsm"] Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.944939 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954852 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-trusted-ca-bundle\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954880 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntxm\" (UniqueName: \"kubernetes.io/projected/61398e4c-d6b9-4a84-9d33-b53e45349442-kube-api-access-8ntxm\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954902 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-service-ca-bundle\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954918 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-policies\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954946 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-client-ca\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954959 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-serving-cert\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954974 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76w9l\" (UniqueName: \"kubernetes.io/projected/ed732303-6d90-47f7-ada6-c88b84249ddb-kube-api-access-76w9l\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.954995 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47473970-6704-4bb3-83fb-eee0a9db5552-serving-cert\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955010 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676dae90-1358-4195-a506-0d4bc4b651db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955033 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-config\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955049 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncr2d\" (UniqueName: \"kubernetes.io/projected/f4159e19-631f-40cf-b53f-fb42d9171a06-kube-api-access-ncr2d\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955062 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9lk2\" (UniqueName: \"kubernetes.io/projected/57e2a16d-ea83-4e99-844e-089ccba97f47-kube-api-access-j9lk2\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955076 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-serving-cert\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955092 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlsn8\" (UniqueName: 
\"kubernetes.io/projected/8da3ae7d-2af2-436f-85e8-542ae6eab03b-kube-api-access-wlsn8\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955114 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955132 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955160 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh8bp\" (UniqueName: \"kubernetes.io/projected/6c57370f-f724-43f3-91c9-03a98c087966-kube-api-access-qh8bp\") pod \"cluster-samples-operator-665b6dd947-5dd7c\" (UID: \"6c57370f-f724-43f3-91c9-03a98c087966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955175 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-service-ca\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 
09:05:34.955192 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212ffb51-33a2-4282-afed-de31b0da9d84-serving-cert\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955213 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4r5z\" (UniqueName: \"kubernetes.io/projected/b3d29e61-443b-4040-b68b-4ad190ea08be-kube-api-access-j4r5z\") pod \"downloads-7954f5f757-4jk5f\" (UID: \"b3d29e61-443b-4040-b68b-4ad190ea08be\") " pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955228 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-config\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955244 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-client-ca\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955258 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a17f8b-948b-41dc-af06-6c98af1134fe-trusted-ca\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " 
pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955273 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-oauth-serving-cert\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955289 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955303 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955315 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955332 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/b52b45bc-5ace-4daa-8548-030f576ece0f-images\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955348 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcmqf\" (UniqueName: \"kubernetes.io/projected/b52b45bc-5ace-4daa-8548-030f576ece0f-kube-api-access-pcmqf\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955375 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/61398e4c-d6b9-4a84-9d33-b53e45349442-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955392 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-dir\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955409 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: 
I0121 09:05:34.955425 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-oauth-config\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955439 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-auth-proxy-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955465 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7gs\" (UniqueName: \"kubernetes.io/projected/47473970-6704-4bb3-83fb-eee0a9db5552-kube-api-access-9f7gs\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955481 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-config\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955495 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61398e4c-d6b9-4a84-9d33-b53e45349442-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955512 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-config\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955527 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61398e4c-d6b9-4a84-9d33-b53e45349442-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955544 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955551 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-service-ca-bundle\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955559 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955576 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22gzd\" (UniqueName: \"kubernetes.io/projected/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-kube-api-access-22gzd\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955591 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed732303-6d90-47f7-ada6-c88b84249ddb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955606 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7g8\" (UniqueName: \"kubernetes.io/projected/676dae90-1358-4195-a506-0d4bc4b651db-kube-api-access-kg7g8\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955623 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2a17f8b-948b-41dc-af06-6c98af1134fe-serving-cert\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " 
pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955641 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed732303-6d90-47f7-ada6-c88b84249ddb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955655 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwdv\" (UniqueName: \"kubernetes.io/projected/212ffb51-33a2-4282-afed-de31b0da9d84-kube-api-access-bmwdv\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955670 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955686 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgrw\" (UniqueName: \"kubernetes.io/projected/b2a17f8b-948b-41dc-af06-6c98af1134fe-kube-api-access-hhgrw\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955691 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-client-ca\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955702 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4159e19-631f-40cf-b53f-fb42d9171a06-machine-approver-tls\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955734 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955756 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955777 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b52b45bc-5ace-4daa-8548-030f576ece0f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955794 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c57370f-f724-43f3-91c9-03a98c087966-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5dd7c\" (UID: \"6c57370f-f724-43f3-91c9-03a98c087966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955827 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955881 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a17f8b-948b-41dc-af06-6c98af1134fe-config\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955889 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-trusted-ca-bundle\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955898 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676dae90-1358-4195-a506-0d4bc4b651db-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955916 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b52b45bc-5ace-4daa-8548-030f576ece0f-config\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955934 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.955951 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.956290 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-config\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.956359 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-policies\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.956426 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.956591 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61398e4c-d6b9-4a84-9d33-b53e45349442-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.957016 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.957053 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 
21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.958079 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-service-ca\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.958805 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-oauth-serving-cert\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.959173 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-dir\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.959669 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.959787 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-config\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc 
kubenswrapper[4618]: I0121 09:05:34.959910 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed732303-6d90-47f7-ada6-c88b84249ddb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960099 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-config\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960101 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960236 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b52b45bc-5ace-4daa-8548-030f576ece0f-config\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960470 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-client-ca\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960520 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676dae90-1358-4195-a506-0d4bc4b651db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960550 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212ffb51-33a2-4282-afed-de31b0da9d84-config\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960600 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a17f8b-948b-41dc-af06-6c98af1134fe-config\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960613 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b52b45bc-5ace-4daa-8548-030f576ece0f-images\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960895 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.960936 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-serving-cert\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.961349 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a17f8b-948b-41dc-af06-6c98af1134fe-trusted-ca\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.961532 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.961649 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676dae90-1358-4195-a506-0d4bc4b651db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.961654 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: 
\"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.962284 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212ffb51-33a2-4282-afed-de31b0da9d84-serving-cert\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.962470 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.962565 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.962706 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c57370f-f724-43f3-91c9-03a98c087966-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5dd7c\" (UID: \"6c57370f-f724-43f3-91c9-03a98c087966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.962755 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47473970-6704-4bb3-83fb-eee0a9db5552-serving-cert\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.963195 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2a17f8b-948b-41dc-af06-6c98af1134fe-serving-cert\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.963202 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.963193 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.963387 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed732303-6d90-47f7-ada6-c88b84249ddb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:34 
crc kubenswrapper[4618]: I0121 09:05:34.963655 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.963667 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.964280 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-oauth-config\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.964430 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/61398e4c-d6b9-4a84-9d33-b53e45349442-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.964468 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-serving-cert\") pod \"console-f9d7485db-dd2fv\" (UID: 
\"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.964474 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b52b45bc-5ace-4daa-8548-030f576ece0f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:34 crc kubenswrapper[4618]: I0121 09:05:34.979916 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.000362 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.020193 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.039609 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.059840 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.080309 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.099925 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.119778 4618 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.140311 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.160039 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.179715 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.200396 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.220015 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.240512 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.265837 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.280315 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.299837 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.320646 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.340455 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.359633 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.380167 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.400505 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.419888 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.439571 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.460442 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.479531 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.500353 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.520233 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: 
I0121 09:05:35.539717 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.560158 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.580667 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.600537 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.640371 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.660437 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.680274 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.699844 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.720461 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.740257 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.759910 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.780595 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.800345 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.819189 4618 request.go:700] Waited for 1.00922966s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.819938 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.840390 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.860197 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.879538 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.900271 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.920124 4618 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.939694 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 09:05:35 crc kubenswrapper[4618]: E0121 09:05:35.956388 4618 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Jan 21 09:05:35 crc kubenswrapper[4618]: E0121 09:05:35.956458 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4159e19-631f-40cf-b53f-fb42d9171a06-machine-approver-tls podName:f4159e19-631f-40cf-b53f-fb42d9171a06 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:36.456441403 +0000 UTC m=+135.206908730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/f4159e19-631f-40cf-b53f-fb42d9171a06-machine-approver-tls") pod "machine-approver-56656f9798-8flgq" (UID: "f4159e19-631f-40cf-b53f-fb42d9171a06") : failed to sync secret cache: timed out waiting for the condition Jan 21 09:05:35 crc kubenswrapper[4618]: E0121 09:05:35.956495 4618 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Jan 21 09:05:35 crc kubenswrapper[4618]: E0121 09:05:35.956544 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-config podName:f4159e19-631f-40cf-b53f-fb42d9171a06 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:36.456530981 +0000 UTC m=+135.206998299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-config") pod "machine-approver-56656f9798-8flgq" (UID: "f4159e19-631f-40cf-b53f-fb42d9171a06") : failed to sync configmap cache: timed out waiting for the condition Jan 21 09:05:35 crc kubenswrapper[4618]: E0121 09:05:35.959271 4618 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Jan 21 09:05:35 crc kubenswrapper[4618]: E0121 09:05:35.959342 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-auth-proxy-config podName:f4159e19-631f-40cf-b53f-fb42d9171a06 nodeName:}" failed. No retries permitted until 2026-01-21 09:05:36.459327572 +0000 UTC m=+135.209794889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-auth-proxy-config") pod "machine-approver-56656f9798-8flgq" (UID: "f4159e19-631f-40cf-b53f-fb42d9171a06") : failed to sync configmap cache: timed out waiting for the condition Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.959386 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.979896 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 09:05:35 crc kubenswrapper[4618]: I0121 09:05:35.999720 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.019749 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.040356 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.059721 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.079788 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.099998 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.120269 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.139873 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.160258 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.180116 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.200190 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.220198 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 
09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.239608 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.264129 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.280261 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.300117 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.320325 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.340430 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.360400 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.380450 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.400412 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.420301 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.440573 4618 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"signing-cabundle" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.460191 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.469344 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-auth-proxy-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.469408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4159e19-631f-40cf-b53f-fb42d9171a06-machine-approver-tls\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.469440 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.480164 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.500654 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.519925 4618 reflector.go:368] Caches populated 
for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.539903 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.559651 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.580054 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.600341 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.620039 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.640445 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.659805 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.680245 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.700648 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.740338 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.759526 4618 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771626 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-trusted-ca\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771655 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8m48\" (UniqueName: \"kubernetes.io/projected/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-kube-api-access-p8m48\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771672 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxwk\" (UniqueName: \"kubernetes.io/projected/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-kube-api-access-sfxwk\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771687 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-serving-cert\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771717 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-image-import-ca\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771731 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-serving-cert\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771771 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq8b9\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-kube-api-access-xq8b9\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771803 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-bound-sa-token\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771820 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-etcd-client\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: 
I0121 09:05:36.771835 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwjm\" (UniqueName: \"kubernetes.io/projected/1d586623-ff45-4d91-8afb-328f6f392a39-kube-api-access-5kwjm\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771893 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771913 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-node-pullsecrets\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771927 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d586623-ff45-4d91-8afb-328f6f392a39-audit-dir\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771948 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-registry-certificates\") pod 
\"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771965 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-etcd-serving-ca\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.771979 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-audit-dir\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772001 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-serving-cert\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772096 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772185 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad054293-342a-4919-b938-6032654fbc53-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: E0121 09:05:36.772206 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.272196841 +0000 UTC m=+136.022664158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-encryption-config\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772255 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-config\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772274 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772299 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772313 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-encryption-config\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772333 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-audit-policies\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772377 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad054293-342a-4919-b938-6032654fbc53-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772418 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/628bce78-0626-4d43-af74-56d40f41679a-metrics-tls\") pod \"dns-operator-744455d44c-bzz2k\" (UID: \"628bce78-0626-4d43-af74-56d40f41679a\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772447 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-etcd-client\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772460 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlzm8\" (UniqueName: \"kubernetes.io/projected/628bce78-0626-4d43-af74-56d40f41679a-kube-api-access-jlzm8\") pod \"dns-operator-744455d44c-bzz2k\" (UID: \"628bce78-0626-4d43-af74-56d40f41679a\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772476 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-registry-tls\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772489 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.772504 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-audit\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.779742 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.811407 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntxm\" (UniqueName: \"kubernetes.io/projected/61398e4c-d6b9-4a84-9d33-b53e45349442-kube-api-access-8ntxm\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.819189 4618 request.go:700] Waited for 1.863921875s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.831075 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76w9l\" (UniqueName: \"kubernetes.io/projected/ed732303-6d90-47f7-ada6-c88b84249ddb-kube-api-access-76w9l\") pod \"openshift-apiserver-operator-796bbdcf4f-6n2rc\" (UID: \"ed732303-6d90-47f7-ada6-c88b84249ddb\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.850401 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh8bp\" (UniqueName: \"kubernetes.io/projected/6c57370f-f724-43f3-91c9-03a98c087966-kube-api-access-qh8bp\") pod \"cluster-samples-operator-665b6dd947-5dd7c\" (UID: \"6c57370f-f724-43f3-91c9-03a98c087966\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.870946 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7g8\" (UniqueName: \"kubernetes.io/projected/676dae90-1358-4195-a506-0d4bc4b651db-kube-api-access-kg7g8\") pod \"openshift-controller-manager-operator-756b6f6bc6-qvpkf\" (UID: \"676dae90-1358-4195-a506-0d4bc4b651db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.872864 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.872966 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-serving-cert\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: E0121 09:05:36.873090 4618 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.373071229 +0000 UTC m=+136.123538546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873375 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55b440b2-92ea-462a-97f8-d59f7d92880a-cert\") pod \"ingress-canary-542mv\" (UID: \"55b440b2-92ea-462a-97f8-d59f7d92880a\") " pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq8b9\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-kube-api-access-xq8b9\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873426 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a09bb023-d629-46af-bc03-a760dbdec6ff-images\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc 
kubenswrapper[4618]: I0121 09:05:36.873454 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-bound-sa-token\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-etcd-client\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873482 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwjm\" (UniqueName: \"kubernetes.io/projected/1d586623-ff45-4d91-8afb-328f6f392a39-kube-api-access-5kwjm\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873501 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-service-ca\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873515 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-plugins-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873543 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-metrics-tls\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873566 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873580 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d344a8-e5dd-4c6b-8229-61db4a629d3a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873594 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rkr\" (UniqueName: \"kubernetes.io/projected/ce155415-eec3-4b54-be9f-5e0729c8f1a2-kube-api-access-58rkr\") pod \"migrator-59844c95c7-4lfh9\" (UID: \"ce155415-eec3-4b54-be9f-5e0729c8f1a2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873608 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-secret-volume\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873621 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khz7k\" (UniqueName: \"kubernetes.io/projected/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-kube-api-access-khz7k\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873636 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-audit-dir\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873651 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09bb023-d629-46af-bc03-a760dbdec6ff-proxy-tls\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873665 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdb8s\" (UniqueName: \"kubernetes.io/projected/b2bf9a75-0044-4ca7-822e-db64a24c6c74-kube-api-access-rdb8s\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873678 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9261bd75-b91b-490f-ad63-ac8e49832d51-proxy-tls\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873693 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k227f\" (UniqueName: \"kubernetes.io/projected/7867ab6f-4cdb-492d-9106-b1f42a66b62e-kube-api-access-k227f\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873707 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67d4m\" (UniqueName: \"kubernetes.io/projected/8ba2e944-8a22-4324-ba71-bca1844cb472-kube-api-access-67d4m\") pod \"package-server-manager-789f6589d5-q99bm\" (UID: \"8ba2e944-8a22-4324-ba71-bca1844cb472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873720 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-registration-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873744 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/473c594e-bb80-4800-9eee-61fdc502cd5b-signing-key\") pod 
\"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873757 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-srv-cert\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873781 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873808 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a09bb023-d629-46af-bc03-a760dbdec6ff-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873834 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-tmpfs\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873856 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-config\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873879 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jcs\" (UniqueName: \"kubernetes.io/projected/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-kube-api-access-g7jcs\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873908 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873923 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873937 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xktb\" (UniqueName: \"kubernetes.io/projected/a09bb023-d629-46af-bc03-a760dbdec6ff-kube-api-access-5xktb\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: 
\"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.873963 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbg26\" (UniqueName: \"kubernetes.io/projected/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-kube-api-access-dbg26\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874002 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-config\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874029 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz67k\" (UniqueName: \"kubernetes.io/projected/3dfa28fb-8191-475f-880a-1da9f1f88d85-kube-api-access-sz67k\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874057 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30a8096c-9962-4989-9811-54a9522f4e2e-service-ca-bundle\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874070 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-ca\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874089 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrxh\" (UniqueName: \"kubernetes.io/projected/981120c7-979b-4459-a147-49279a72f3a8-kube-api-access-5hrxh\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874132 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dp9f8\" (UID: \"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874214 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlzm8\" (UniqueName: \"kubernetes.io/projected/628bce78-0626-4d43-af74-56d40f41679a-kube-api-access-jlzm8\") pod 
\"dns-operator-744455d44c-bzz2k\" (UID: \"628bce78-0626-4d43-af74-56d40f41679a\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874322 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-audit-dir\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874699 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874712 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfa28fb-8191-475f-880a-1da9f1f88d85-serving-cert\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874743 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-audit\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: E0121 09:05:36.874849 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 09:05:37.374839679 +0000 UTC m=+136.125306997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874877 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/62410b31-13fa-4314-8140-89abdac98679-srv-cert\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874911 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5ghv\" (UniqueName: \"kubernetes.io/projected/53d344a8-e5dd-4c6b-8229-61db4a629d3a-kube-api-access-l5ghv\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874931 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-profile-collector-cert\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874970 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-serving-cert\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.874987 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-image-import-ca\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bcs\" (UniqueName: \"kubernetes.io/projected/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-kube-api-access-j2bcs\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875092 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-audit\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875178 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6fq\" (UniqueName: \"kubernetes.io/projected/55b440b2-92ea-462a-97f8-d59f7d92880a-kube-api-access-qj6fq\") pod \"ingress-canary-542mv\" (UID: \"55b440b2-92ea-462a-97f8-d59f7d92880a\") " pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875212 
4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmnq\" (UniqueName: \"kubernetes.io/projected/6177812c-28b6-4b42-92ba-0dab630aa890-kube-api-access-6kmnq\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875228 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qlqw\" (UniqueName: \"kubernetes.io/projected/b98c0d00-f768-4198-b363-7fd9aa977d2c-kube-api-access-2qlqw\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875278 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d9ms\" (UniqueName: \"kubernetes.io/projected/62410b31-13fa-4314-8140-89abdac98679-kube-api-access-8d9ms\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875296 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cb72ca-7028-470b-a465-61bd4cf812e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875316 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875331 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-config-volume\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875349 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-node-pullsecrets\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875375 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d586623-ff45-4d91-8afb-328f6f392a39-audit-dir\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875393 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/473c594e-bb80-4800-9eee-61fdc502cd5b-signing-cabundle\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875408 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-t6dh4\" (UniqueName: \"kubernetes.io/projected/fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b-kube-api-access-t6dh4\") pod \"multus-admission-controller-857f4d67dd-dp9f8\" (UID: \"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875422 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6177812c-28b6-4b42-92ba-0dab630aa890-serving-cert\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875436 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-client\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875487 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d586623-ff45-4d91-8afb-328f6f392a39-audit-dir\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875491 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-registry-certificates\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: 
I0121 09:05:36.875514 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-etcd-serving-ca\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875521 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-node-pullsecrets\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875539 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-default-certificate\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875654 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-serving-cert\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875671 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875722 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-image-import-ca\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875774 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b2bf9a75-0044-4ca7-822e-db64a24c6c74-node-bootstrap-token\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.875961 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876042 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876056 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876092 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgfg\" (UniqueName: \"kubernetes.io/projected/c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f-kube-api-access-wzgfg\") pod \"control-plane-machine-set-operator-78cbb6b69f-9t8g5\" (UID: \"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876292 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfa28fb-8191-475f-880a-1da9f1f88d85-config\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876498 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-config-volume\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876590 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981120c7-979b-4459-a147-49279a72f3a8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") 
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876656 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad054293-342a-4919-b938-6032654fbc53-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876703 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-encryption-config\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876754 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-etcd-serving-ca\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876793 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65cb72ca-7028-470b-a465-61bd4cf812e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876797 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-serving-cert\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876855 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-config\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876893 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876962 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981120c7-979b-4459-a147-49279a72f3a8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.876978 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-registry-certificates\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 
09:05:36.877003 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877009 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad054293-342a-4919-b938-6032654fbc53-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877069 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk7gp\" (UniqueName: \"kubernetes.io/projected/473c594e-bb80-4800-9eee-61fdc502cd5b-kube-api-access-kk7gp\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877152 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-encryption-config\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877173 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-stats-auth\") pod \"router-default-5444994796-sknqd\" (UID: 
\"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877215 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-audit-policies\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877236 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad054293-342a-4919-b938-6032654fbc53-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877262 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-metrics-certs\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877280 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53d344a8-e5dd-4c6b-8229-61db4a629d3a-trusted-ca\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877322 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877476 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-config\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877491 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/628bce78-0626-4d43-af74-56d40f41679a-metrics-tls\") pod \"dns-operator-744455d44c-bzz2k\" (UID: \"628bce78-0626-4d43-af74-56d40f41679a\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877523 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b2bf9a75-0044-4ca7-822e-db64a24c6c74-certs\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877605 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d344a8-e5dd-4c6b-8229-61db4a629d3a-metrics-tls\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877631 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-csi-data-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877693 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-socket-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877747 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-etcd-client\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877764 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnpvz\" (UniqueName: \"kubernetes.io/projected/9261bd75-b91b-490f-ad63-ac8e49832d51-kube-api-access-tnpvz\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877783 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/62410b31-13fa-4314-8140-89abdac98679-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877799 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9t8g5\" (UID: \"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877931 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-serving-cert\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877968 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cb72ca-7028-470b-a465-61bd4cf812e1-config\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.877984 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-mountpoint-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878011 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-etcd-client\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878055 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba2e944-8a22-4324-ba71-bca1844cb472-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q99bm\" (UID: \"8ba2e944-8a22-4324-ba71-bca1844cb472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878077 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqg7\" (UniqueName: \"kubernetes.io/projected/30a8096c-9962-4989-9811-54a9522f4e2e-kube-api-access-zvqg7\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878101 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-registry-tls\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878118 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:36 crc 
kubenswrapper[4618]: I0121 09:05:36.878349 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-webhook-cert\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878387 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9261bd75-b91b-490f-ad63-ac8e49832d51-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878438 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-trusted-ca\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878511 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8m48\" (UniqueName: \"kubernetes.io/projected/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-kube-api-access-p8m48\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.878539 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxwk\" (UniqueName: \"kubernetes.io/projected/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-kube-api-access-sfxwk\") pod 
\"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.879426 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-trusted-ca\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.879599 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/628bce78-0626-4d43-af74-56d40f41679a-metrics-tls\") pod \"dns-operator-744455d44c-bzz2k\" (UID: \"628bce78-0626-4d43-af74-56d40f41679a\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.879629 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-encryption-config\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.880208 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad054293-342a-4919-b938-6032654fbc53-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.880639 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-registry-tls\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.889516 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61398e4c-d6b9-4a84-9d33-b53e45349442-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qr5fr\" (UID: \"61398e4c-d6b9-4a84-9d33-b53e45349442\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.911135 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlsn8\" (UniqueName: \"kubernetes.io/projected/8da3ae7d-2af2-436f-85e8-542ae6eab03b-kube-api-access-wlsn8\") pod \"console-f9d7485db-dd2fv\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.918025 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.923521 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.931053 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwdv\" (UniqueName: \"kubernetes.io/projected/212ffb51-33a2-4282-afed-de31b0da9d84-kube-api-access-bmwdv\") pod \"authentication-operator-69f744f599-b7sq5\" (UID: \"212ffb51-33a2-4282-afed-de31b0da9d84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.935932 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.952453 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.964216 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.973872 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9lk2\" (UniqueName: \"kubernetes.io/projected/57e2a16d-ea83-4e99-844e-089ccba97f47-kube-api-access-j9lk2\") pod \"oauth-openshift-558db77b4-fnvx2\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.979980 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:36 crc kubenswrapper[4618]: E0121 09:05:36.980087 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.480066028 +0000 UTC m=+136.230533345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980128 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-profile-collector-cert\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980162 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5ghv\" (UniqueName: \"kubernetes.io/projected/53d344a8-e5dd-4c6b-8229-61db4a629d3a-kube-api-access-l5ghv\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980193 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bcs\" (UniqueName: \"kubernetes.io/projected/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-kube-api-access-j2bcs\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980211 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qlqw\" (UniqueName: 
\"kubernetes.io/projected/b98c0d00-f768-4198-b363-7fd9aa977d2c-kube-api-access-2qlqw\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980226 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6fq\" (UniqueName: \"kubernetes.io/projected/55b440b2-92ea-462a-97f8-d59f7d92880a-kube-api-access-qj6fq\") pod \"ingress-canary-542mv\" (UID: \"55b440b2-92ea-462a-97f8-d59f7d92880a\") " pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980241 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmnq\" (UniqueName: \"kubernetes.io/projected/6177812c-28b6-4b42-92ba-0dab630aa890-kube-api-access-6kmnq\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980256 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d9ms\" (UniqueName: \"kubernetes.io/projected/62410b31-13fa-4314-8140-89abdac98679-kube-api-access-8d9ms\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980270 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cb72ca-7028-470b-a465-61bd4cf812e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980288 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/473c594e-bb80-4800-9eee-61fdc502cd5b-signing-cabundle\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980304 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dh4\" (UniqueName: \"kubernetes.io/projected/fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b-kube-api-access-t6dh4\") pod \"multus-admission-controller-857f4d67dd-dp9f8\" (UID: \"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980324 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980339 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-config-volume\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980354 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-default-certificate\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 
21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980367 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6177812c-28b6-4b42-92ba-0dab630aa890-serving-cert\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980379 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-client\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980392 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b2bf9a75-0044-4ca7-822e-db64a24c6c74-node-bootstrap-token\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980411 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980425 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfa28fb-8191-475f-880a-1da9f1f88d85-config\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980440 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980455 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgfg\" (UniqueName: \"kubernetes.io/projected/c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f-kube-api-access-wzgfg\") pod \"control-plane-machine-set-operator-78cbb6b69f-9t8g5\" (UID: \"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980475 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-config-volume\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980491 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981120c7-979b-4459-a147-49279a72f3a8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980514 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65cb72ca-7028-470b-a465-61bd4cf812e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980529 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk7gp\" (UniqueName: \"kubernetes.io/projected/473c594e-bb80-4800-9eee-61fdc502cd5b-kube-api-access-kk7gp\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980544 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981120c7-979b-4459-a147-49279a72f3a8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980560 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980575 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-stats-auth\") pod \"router-default-5444994796-sknqd\" (UID: 
\"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980594 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-metrics-certs\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980607 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53d344a8-e5dd-4c6b-8229-61db4a629d3a-trusted-ca\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980623 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980651 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b2bf9a75-0044-4ca7-822e-db64a24c6c74-certs\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980664 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d344a8-e5dd-4c6b-8229-61db4a629d3a-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980678 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-csi-data-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980692 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-socket-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980707 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnpvz\" (UniqueName: \"kubernetes.io/projected/9261bd75-b91b-490f-ad63-ac8e49832d51-kube-api-access-tnpvz\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980723 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/62410b31-13fa-4314-8140-89abdac98679-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980739 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9t8g5\" (UID: \"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980754 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cb72ca-7028-470b-a465-61bd4cf812e1-config\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980774 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba2e944-8a22-4324-ba71-bca1844cb472-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q99bm\" (UID: \"8ba2e944-8a22-4324-ba71-bca1844cb472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980788 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-mountpoint-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.980806 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqg7\" (UniqueName: \"kubernetes.io/projected/30a8096c-9962-4989-9811-54a9522f4e2e-kube-api-access-zvqg7\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " 
pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.981673 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfa28fb-8191-475f-880a-1da9f1f88d85-config\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.981802 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/473c594e-bb80-4800-9eee-61fdc502cd5b-signing-cabundle\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.981968 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-webhook-cert\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.981989 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9261bd75-b91b-490f-ad63-ac8e49832d51-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982050 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55b440b2-92ea-462a-97f8-d59f7d92880a-cert\") pod \"ingress-canary-542mv\" (UID: 
\"55b440b2-92ea-462a-97f8-d59f7d92880a\") " pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982107 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-config-volume\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982185 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a09bb023-d629-46af-bc03-a760dbdec6ff-images\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982250 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-service-ca\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982285 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-plugins-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982300 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-metrics-tls\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " 
pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982389 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982433 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d344a8-e5dd-4c6b-8229-61db4a629d3a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982667 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rkr\" (UniqueName: \"kubernetes.io/projected/ce155415-eec3-4b54-be9f-5e0729c8f1a2-kube-api-access-58rkr\") pod \"migrator-59844c95c7-4lfh9\" (UID: \"ce155415-eec3-4b54-be9f-5e0729c8f1a2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982685 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-secret-volume\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982686 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cb72ca-7028-470b-a465-61bd4cf812e1-config\") pod 
\"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982699 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khz7k\" (UniqueName: \"kubernetes.io/projected/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-kube-api-access-khz7k\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982732 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09bb023-d629-46af-bc03-a760dbdec6ff-proxy-tls\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982752 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdb8s\" (UniqueName: \"kubernetes.io/projected/b2bf9a75-0044-4ca7-822e-db64a24c6c74-kube-api-access-rdb8s\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982770 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9261bd75-b91b-490f-ad63-ac8e49832d51-proxy-tls\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982785 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-registration-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982809 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/473c594e-bb80-4800-9eee-61fdc502cd5b-signing-key\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982826 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-srv-cert\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982840 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982858 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k227f\" (UniqueName: \"kubernetes.io/projected/7867ab6f-4cdb-492d-9106-b1f42a66b62e-kube-api-access-k227f\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982873 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67d4m\" (UniqueName: \"kubernetes.io/projected/8ba2e944-8a22-4324-ba71-bca1844cb472-kube-api-access-67d4m\") pod \"package-server-manager-789f6589d5-q99bm\" (UID: \"8ba2e944-8a22-4324-ba71-bca1844cb472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982894 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a09bb023-d629-46af-bc03-a760dbdec6ff-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982914 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-tmpfs\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982931 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-config\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982950 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982965 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jcs\" (UniqueName: \"kubernetes.io/projected/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-kube-api-access-g7jcs\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982980 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xktb\" (UniqueName: \"kubernetes.io/projected/a09bb023-d629-46af-bc03-a760dbdec6ff-kube-api-access-5xktb\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.982998 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbg26\" (UniqueName: \"kubernetes.io/projected/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-kube-api-access-dbg26\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983033 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-config\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983049 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz67k\" (UniqueName: 
\"kubernetes.io/projected/3dfa28fb-8191-475f-880a-1da9f1f88d85-kube-api-access-sz67k\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983064 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30a8096c-9962-4989-9811-54a9522f4e2e-service-ca-bundle\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983079 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-ca\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983096 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983117 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrxh\" (UniqueName: \"kubernetes.io/projected/981120c7-979b-4459-a147-49279a72f3a8-kube-api-access-5hrxh\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 
09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983132 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dp9f8\" (UID: \"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983499 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfa28fb-8191-475f-880a-1da9f1f88d85-serving-cert\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983522 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/62410b31-13fa-4314-8140-89abdac98679-srv-cert\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983613 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b2bf9a75-0044-4ca7-822e-db64a24c6c74-node-bootstrap-token\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.983905 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-csi-data-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.984081 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-service-ca\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.984130 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-mountpoint-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.984313 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cb72ca-7028-470b-a465-61bd4cf812e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.984438 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-socket-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.985461 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-profile-collector-cert\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: 
\"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.985874 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-default-certificate\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.985933 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-config\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.986480 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a09bb023-d629-46af-bc03-a760dbdec6ff-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.986778 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.987282 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-tmpfs\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.987672 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.987799 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6177812c-28b6-4b42-92ba-0dab630aa890-serving-cert\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.987871 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30a8096c-9962-4989-9811-54a9522f4e2e-service-ca-bundle\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.988216 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-ca\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.988288 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/62410b31-13fa-4314-8140-89abdac98679-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.988592 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-config-volume\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.988643 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/62410b31-13fa-4314-8140-89abdac98679-srv-cert\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.988683 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/473c594e-bb80-4800-9eee-61fdc502cd5b-signing-key\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.988739 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-registration-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.989076 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/981120c7-979b-4459-a147-49279a72f3a8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.989220 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6177812c-28b6-4b42-92ba-0dab630aa890-config\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.989327 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b98c0d00-f768-4198-b363-7fd9aa977d2c-plugins-dir\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.989541 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.989989 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9261bd75-b91b-490f-ad63-ac8e49832d51-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 
09:05:36.990428 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a09bb023-d629-46af-bc03-a760dbdec6ff-images\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.993495 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-srv-cert\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.993877 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6177812c-28b6-4b42-92ba-0dab630aa890-etcd-client\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.994421 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b2bf9a75-0044-4ca7-822e-db64a24c6c74-certs\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.994669 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22gzd\" (UniqueName: \"kubernetes.io/projected/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-kube-api-access-22gzd\") pod \"route-controller-manager-6576b87f9c-gqzfs\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 
09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.994757 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-webhook-cert\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:36 crc kubenswrapper[4618]: E0121 09:05:36.995135 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.495123916 +0000 UTC m=+136.245591233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.997513 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09bb023-d629-46af-bc03-a760dbdec6ff-proxy-tls\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.998112 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.998534 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d344a8-e5dd-4c6b-8229-61db4a629d3a-metrics-tls\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.999722 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53d344a8-e5dd-4c6b-8229-61db4a629d3a-trusted-ca\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:36 crc kubenswrapper[4618]: I0121 09:05:36.999754 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-metrics-certs\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.000087 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfa28fb-8191-475f-880a-1da9f1f88d85-serving-cert\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.000379 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981120c7-979b-4459-a147-49279a72f3a8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: 
\"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.005331 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/30a8096c-9962-4989-9811-54a9522f4e2e-stats-auth\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.005473 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.005702 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9261bd75-b91b-490f-ad63-ac8e49832d51-proxy-tls\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.005708 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dp9f8\" (UID: \"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.005735 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-metrics-tls\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.006287 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.006560 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55b440b2-92ea-462a-97f8-d59f7d92880a-cert\") pod \"ingress-canary-542mv\" (UID: \"55b440b2-92ea-462a-97f8-d59f7d92880a\") " pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.006792 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9t8g5\" (UID: \"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.006798 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba2e944-8a22-4324-ba71-bca1844cb472-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q99bm\" (UID: \"8ba2e944-8a22-4324-ba71-bca1844cb472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.012044 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-secret-volume\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.015290 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4r5z\" (UniqueName: \"kubernetes.io/projected/b3d29e61-443b-4040-b68b-4ad190ea08be-kube-api-access-j4r5z\") pod \"downloads-7954f5f757-4jk5f\" (UID: \"b3d29e61-443b-4040-b68b-4ad190ea08be\") " pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.032644 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcmqf\" (UniqueName: \"kubernetes.io/projected/b52b45bc-5ace-4daa-8548-030f576ece0f-kube-api-access-pcmqf\") pod \"machine-api-operator-5694c8668f-kt5l4\" (UID: \"b52b45bc-5ace-4daa-8548-030f576ece0f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.050213 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.052130 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgrw\" (UniqueName: \"kubernetes.io/projected/b2a17f8b-948b-41dc-af06-6c98af1134fe-kube-api-access-hhgrw\") pod \"console-operator-58897d9998-2zw6g\" (UID: \"b2a17f8b-948b-41dc-af06-6c98af1134fe\") " pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.071673 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7gs\" (UniqueName: 
\"kubernetes.io/projected/47473970-6704-4bb3-83fb-eee0a9db5552-kube-api-access-9f7gs\") pod \"controller-manager-879f6c89f-ck8kg\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.080528 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.084109 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.084239 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.584217329 +0000 UTC m=+136.334684645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.084453 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.084667 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.584660561 +0000 UTC m=+136.335127878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.099747 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.109581 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-encryption-config\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.139907 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.159870 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.160667 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.169982 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.179823 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.181665 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.185557 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.185683 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.685668661 +0000 UTC m=+136.436135978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.186099 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.186369 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.686358076 +0000 UTC m=+136.436825392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.190282 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.200949 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.205930 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.209910 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.219993 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.230294 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4159e19-631f-40cf-b53f-fb42d9171a06-auth-proxy-config\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.240382 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.242289 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.243656 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncr2d\" (UniqueName: \"kubernetes.io/projected/f4159e19-631f-40cf-b53f-fb42d9171a06-kube-api-access-ncr2d\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.258159 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.260249 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.268322 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-serving-cert\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.269492 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.274812 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ck8kg"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.282006 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.287587 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.287956 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.289720 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr"] Jan 21 09:05:37 crc kubenswrapper[4618]: W0121 09:05:37.289792 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47473970_6704_4bb3_83fb_eee0a9db5552.slice/crio-a885e00f0550c6c4edf6d1578bb60b345da9b2b776a478c90eb8748381eb3087 WatchSource:0}: Error finding container a885e00f0550c6c4edf6d1578bb60b345da9b2b776a478c90eb8748381eb3087: Status 404 returned error can't find the container with id a885e00f0550c6c4edf6d1578bb60b345da9b2b776a478c90eb8748381eb3087 Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.290241 4618 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.790201629 +0000 UTC m=+136.540668946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.291495 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d586623-ff45-4d91-8afb-328f6f392a39-etcd-client\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.299881 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 09:05:37 crc kubenswrapper[4618]: W0121 09:05:37.301773 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61398e4c_d6b9_4a84_9d33_b53e45349442.slice/crio-59e1b939b907d845084b88bc5fd836d76f1f4735fef5707fdb950024caf587a8 WatchSource:0}: Error finding container 59e1b939b907d845084b88bc5fd836d76f1f4735fef5707fdb950024caf587a8: Status 404 returned error can't find the container with id 59e1b939b907d845084b88bc5fd836d76f1f4735fef5707fdb950024caf587a8 Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.321348 4618 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.327702 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.328651 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dd2fv"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.341803 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.351921 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4159e19-631f-40cf-b53f-fb42d9171a06-machine-approver-tls\") pod \"machine-approver-56656f9798-8flgq\" (UID: \"f4159e19-631f-40cf-b53f-fb42d9171a06\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.363758 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.364296 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.380439 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.388287 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/1d586623-ff45-4d91-8afb-328f6f392a39-audit-policies\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.389088 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.389541 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.889531628 +0000 UTC m=+136.639998935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.417554 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-bound-sa-token\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.433875 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4jk5f"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.435839 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwjm\" (UniqueName: \"kubernetes.io/projected/1d586623-ff45-4d91-8afb-328f6f392a39-kube-api-access-5kwjm\") pod \"apiserver-7bbb656c7d-wvh8n\" (UID: \"1d586623-ff45-4d91-8afb-328f6f392a39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.437724 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" Jan 21 09:05:37 crc kubenswrapper[4618]: W0121 09:05:37.441264 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d29e61_443b_4040_b68b_4ad190ea08be.slice/crio-1623dc22cc8f6ff417ca6bcedb66e24f5c1a89b83dd7d8b2d8a0261727a3f09d WatchSource:0}: Error finding container 1623dc22cc8f6ff417ca6bcedb66e24f5c1a89b83dd7d8b2d8a0261727a3f09d: Status 404 returned error can't find the container with id 1623dc22cc8f6ff417ca6bcedb66e24f5c1a89b83dd7d8b2d8a0261727a3f09d Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.450217 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.452885 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq8b9\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-kube-api-access-xq8b9\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.455607 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw6g"] Jan 21 09:05:37 crc kubenswrapper[4618]: W0121 09:05:37.463343 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a17f8b_948b_41dc_af06_6c98af1134fe.slice/crio-56fe0de26c8082a54e9bbba54929f8c9a7690be39e704ffe0504ef275774d112 WatchSource:0}: Error finding container 56fe0de26c8082a54e9bbba54929f8c9a7690be39e704ffe0504ef275774d112: Status 404 returned error can't find the container with id 56fe0de26c8082a54e9bbba54929f8c9a7690be39e704ffe0504ef275774d112 Jan 21 09:05:37 crc 
kubenswrapper[4618]: I0121 09:05:37.471569 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlzm8\" (UniqueName: \"kubernetes.io/projected/628bce78-0626-4d43-af74-56d40f41679a-kube-api-access-jlzm8\") pod \"dns-operator-744455d44c-bzz2k\" (UID: \"628bce78-0626-4d43-af74-56d40f41679a\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.480956 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fnvx2"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.489841 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.490127 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.99010411 +0000 UTC m=+136.740571427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.490457 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.490723 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:37.990711741 +0000 UTC m=+136.741179049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.491999 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8m48\" (UniqueName: \"kubernetes.io/projected/7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82-kube-api-access-p8m48\") pod \"apiserver-76f77b778f-cj7pg\" (UID: \"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82\") " pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:37 crc kubenswrapper[4618]: W0121 09:05:37.493945 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57e2a16d_ea83_4e99_844e_089ccba97f47.slice/crio-0b34bb744e5ece1e83c13da7480e6c48ed6027512711ba27ff867cf6609e60be WatchSource:0}: Error finding container 0b34bb744e5ece1e83c13da7480e6c48ed6027512711ba27ff867cf6609e60be: Status 404 returned error can't find the container with id 0b34bb744e5ece1e83c13da7480e6c48ed6027512711ba27ff867cf6609e60be Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.516936 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxwk\" (UniqueName: \"kubernetes.io/projected/b9341e4b-9c69-4d1d-90a7-13e6b9bc508d-kube-api-access-sfxwk\") pod \"openshift-config-operator-7777fb866f-2qf46\" (UID: \"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.529116 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.530475 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmnq\" (UniqueName: \"kubernetes.io/projected/6177812c-28b6-4b42-92ba-0dab630aa890-kube-api-access-6kmnq\") pod \"etcd-operator-b45778765-prprb\" (UID: \"6177812c-28b6-4b42-92ba-0dab630aa890\") " pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.531265 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.552985 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qlqw\" (UniqueName: \"kubernetes.io/projected/b98c0d00-f768-4198-b363-7fd9aa977d2c-kube-api-access-2qlqw\") pod \"csi-hostpathplugin-mf758\" (UID: \"b98c0d00-f768-4198-b363-7fd9aa977d2c\") " pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.571911 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-b7sq5"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.572578 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kt5l4"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.573803 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6fq\" (UniqueName: \"kubernetes.io/projected/55b440b2-92ea-462a-97f8-d59f7d92880a-kube-api-access-qj6fq\") pod \"ingress-canary-542mv\" (UID: \"55b440b2-92ea-462a-97f8-d59f7d92880a\") " pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.583705 4618 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.589266 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.591724 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.592248 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.09223573 +0000 UTC m=+136.842703047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.595854 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dznvh\" (UID: \"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.607262 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.613847 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.616320 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5ghv\" (UniqueName: \"kubernetes.io/projected/53d344a8-e5dd-4c6b-8229-61db4a629d3a-kube-api-access-l5ghv\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.635678 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e333f8e7-fa2a-41d8-a917-dc9f00a556d5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bjndc\" (UID: \"e333f8e7-fa2a-41d8-a917-dc9f00a556d5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.661551 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d9ms\" (UniqueName: \"kubernetes.io/projected/62410b31-13fa-4314-8140-89abdac98679-kube-api-access-8d9ms\") pod \"olm-operator-6b444d44fb-rz25z\" (UID: \"62410b31-13fa-4314-8140-89abdac98679\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.674469 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dh4\" (UniqueName: \"kubernetes.io/projected/fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b-kube-api-access-t6dh4\") pod \"multus-admission-controller-857f4d67dd-dp9f8\" (UID: \"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.693419 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2qf46"] Jan 21 
09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.694459 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.694746 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.194736431 +0000 UTC m=+136.945203749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.698616 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bcs\" (UniqueName: \"kubernetes.io/projected/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-kube-api-access-j2bcs\") pod \"collect-profiles-29483100-jx4hd\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.700034 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.714087 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgfg\" (UniqueName: \"kubernetes.io/projected/c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f-kube-api-access-wzgfg\") pod \"control-plane-machine-set-operator-78cbb6b69f-9t8g5\" (UID: \"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:37 crc kubenswrapper[4618]: W0121 09:05:37.716100 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9341e4b_9c69_4d1d_90a7_13e6b9bc508d.slice/crio-4ed3aebd3d21f8efa12e70f5a0d802650efdff24b86830fe2b6407f6968cb5fb WatchSource:0}: Error finding container 4ed3aebd3d21f8efa12e70f5a0d802650efdff24b86830fe2b6407f6968cb5fb: Status 404 returned error can't find the container with id 4ed3aebd3d21f8efa12e70f5a0d802650efdff24b86830fe2b6407f6968cb5fb Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.721800 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.730961 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqg7\" (UniqueName: \"kubernetes.io/projected/30a8096c-9962-4989-9811-54a9522f4e2e-kube-api-access-zvqg7\") pod \"router-default-5444994796-sknqd\" (UID: \"30a8096c-9962-4989-9811-54a9522f4e2e\") " pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.731085 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.735538 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.757246 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mf758" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.760722 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnpvz\" (UniqueName: \"kubernetes.io/projected/9261bd75-b91b-490f-ad63-ac8e49832d51-kube-api-access-tnpvz\") pod \"machine-config-controller-84d6567774-9dmlp\" (UID: \"9261bd75-b91b-490f-ad63-ac8e49832d51\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.764940 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-542mv" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.772760 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzz2k"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.779568 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz67k\" (UniqueName: \"kubernetes.io/projected/3dfa28fb-8191-475f-880a-1da9f1f88d85-kube-api-access-sz67k\") pod \"service-ca-operator-777779d784-gwzmb\" (UID: \"3dfa28fb-8191-475f-880a-1da9f1f88d85\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.794355 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k227f\" (UniqueName: \"kubernetes.io/projected/7867ab6f-4cdb-492d-9106-b1f42a66b62e-kube-api-access-k227f\") pod \"marketplace-operator-79b997595-55flm\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.795518 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.795836 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.295822448 +0000 UTC m=+137.046289765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.814894 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67d4m\" (UniqueName: \"kubernetes.io/projected/8ba2e944-8a22-4324-ba71-bca1844cb472-kube-api-access-67d4m\") pod \"package-server-manager-789f6589d5-q99bm\" (UID: \"8ba2e944-8a22-4324-ba71-bca1844cb472\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.840221 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrxh\" (UniqueName: \"kubernetes.io/projected/981120c7-979b-4459-a147-49279a72f3a8-kube-api-access-5hrxh\") pod \"kube-storage-version-migrator-operator-b67b599dd-9sns4\" (UID: \"981120c7-979b-4459-a147-49279a72f3a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.863935 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xktb\" (UniqueName: \"kubernetes.io/projected/a09bb023-d629-46af-bc03-a760dbdec6ff-kube-api-access-5xktb\") pod \"machine-config-operator-74547568cd-bjn98\" (UID: \"a09bb023-d629-46af-bc03-a760dbdec6ff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.882370 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jcs\" (UniqueName: 
\"kubernetes.io/projected/f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f-kube-api-access-g7jcs\") pod \"catalog-operator-68c6474976-h7hcr\" (UID: \"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.889994 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.894949 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.896673 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.897096 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.397074997 +0000 UTC m=+137.147542304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.904726 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdb8s\" (UniqueName: \"kubernetes.io/projected/b2bf9a75-0044-4ca7-822e-db64a24c6c74-kube-api-access-rdb8s\") pod \"machine-config-server-cmnsm\" (UID: \"b2bf9a75-0044-4ca7-822e-db64a24c6c74\") " pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.917987 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbg26\" (UniqueName: \"kubernetes.io/projected/5feca275-ba2c-47ac-aba6-071b8fc7a6d9-kube-api-access-dbg26\") pod \"packageserver-d55dfcdfc-jvxsv\" (UID: \"5feca275-ba2c-47ac-aba6-071b8fc7a6d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.926960 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh"] Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.930178 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.936252 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.942250 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.943515 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53d344a8-e5dd-4c6b-8229-61db4a629d3a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gr4qn\" (UID: \"53d344a8-e5dd-4c6b-8229-61db4a629d3a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.965484 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4jk5f" event={"ID":"b3d29e61-443b-4040-b68b-4ad190ea08be","Type":"ContainerStarted","Data":"c97f26df99a063b487f35699405de4a3337370d9aba64640d6934835998c8136"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.965522 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4jk5f" event={"ID":"b3d29e61-443b-4040-b68b-4ad190ea08be","Type":"ContainerStarted","Data":"1623dc22cc8f6ff417ca6bcedb66e24f5c1a89b83dd7d8b2d8a0261727a3f09d"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.966180 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.966212 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.966476 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.970278 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rkr\" (UniqueName: \"kubernetes.io/projected/ce155415-eec3-4b54-be9f-5e0729c8f1a2-kube-api-access-58rkr\") pod \"migrator-59844c95c7-4lfh9\" (UID: \"ce155415-eec3-4b54-be9f-5e0729c8f1a2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.971483 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" event={"ID":"6c57370f-f724-43f3-91c9-03a98c087966","Type":"ContainerStarted","Data":"4d65691a58fd2ff49ea6307bb4506bc232929bed028288fe433d0aff5ec77940"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.971509 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" event={"ID":"6c57370f-f724-43f3-91c9-03a98c087966","Type":"ContainerStarted","Data":"cdafcf98bc2377425c897320c0b33904651c07f84ca4bf0a5abcf995b4da7932"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.971522 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" event={"ID":"6c57370f-f724-43f3-91c9-03a98c087966","Type":"ContainerStarted","Data":"a594fbbd49f8f7bccb4bc64d69b5c1739f97177800e20033fab7a87013753a1a"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.971583 4618 patch_prober.go:28] interesting pod/downloads-7954f5f757-4jk5f container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 
09:05:37.971607 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4jk5f" podUID="b3d29e61-443b-4040-b68b-4ad190ea08be" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.974925 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" event={"ID":"b2a17f8b-948b-41dc-af06-6c98af1134fe","Type":"ContainerStarted","Data":"5c2977be039a2c4cfd7fd3e47c01e5facc4e5f8f271443a0e363a80c1de9956a"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.974967 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" event={"ID":"b2a17f8b-948b-41dc-af06-6c98af1134fe","Type":"ContainerStarted","Data":"56fe0de26c8082a54e9bbba54929f8c9a7690be39e704ffe0504ef275774d112"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.975908 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.976809 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" event={"ID":"1d586623-ff45-4d91-8afb-328f6f392a39","Type":"ContainerStarted","Data":"99569f16d2cc267d15d04d3a4faeb9791aa73a43879d0c7b12035cd7af660759"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.977463 4618 patch_prober.go:28] interesting pod/console-operator-58897d9998-2zw6g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.977493 4618 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-2zw6g" podUID="b2a17f8b-948b-41dc-af06-6c98af1134fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.978925 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" event={"ID":"f4159e19-631f-40cf-b53f-fb42d9171a06","Type":"ContainerStarted","Data":"9637fbedcebdc480e03b7a3b3e2089440ce81ccebe8677e8a11ca2fed59e2da3"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.978952 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" event={"ID":"f4159e19-631f-40cf-b53f-fb42d9171a06","Type":"ContainerStarted","Data":"5346522a1b25410a5caeceb62dc8142286e510569d5d4bcfc9a25bab3d251c4a"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.982469 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dd2fv" event={"ID":"8da3ae7d-2af2-436f-85e8-542ae6eab03b","Type":"ContainerStarted","Data":"05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.982500 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dd2fv" event={"ID":"8da3ae7d-2af2-436f-85e8-542ae6eab03b","Type":"ContainerStarted","Data":"bd3be72cf5180a54be071d83b1708ed1bf759b47d69699af4220900090cf9a10"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.984658 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.993925 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.996767 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" event={"ID":"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6","Type":"ContainerStarted","Data":"d279fa688589063f347b1f44daa69072d2e493dec3ebfdc374dbc46d9d13138b"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.996816 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" event={"ID":"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6","Type":"ContainerStarted","Data":"2ff492034a72fc4c1e44830bdf14476aeec334f1721ed155ad9f0ee761b4af7a"} Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.999128 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:37 crc kubenswrapper[4618]: E0121 09:05:37.999510 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.499497773 +0000 UTC m=+137.249965090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.999551 4618 request.go:700] Waited for 1.000574924s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/serviceaccounts/kube-controller-manager-operator/token Jan 21 09:05:37 crc kubenswrapper[4618]: I0121 09:05:37.999724 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.001618 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" event={"ID":"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d","Type":"ContainerStarted","Data":"4ed3aebd3d21f8efa12e70f5a0d802650efdff24b86830fe2b6407f6968cb5fb"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.002479 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.003211 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" event={"ID":"61398e4c-d6b9-4a84-9d33-b53e45349442","Type":"ContainerStarted","Data":"621e85aa31c28ef740e1fc4bad749412fbe846da7c86cc4ca639f9eff5737ed9"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.003239 4618 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" event={"ID":"61398e4c-d6b9-4a84-9d33-b53e45349442","Type":"ContainerStarted","Data":"59e1b939b907d845084b88bc5fd836d76f1f4735fef5707fdb950024caf587a8"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.004456 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khz7k\" (UniqueName: \"kubernetes.io/projected/93c7931a-f6cf-4936-ae5f-be56fc6b21ed-kube-api-access-khz7k\") pod \"dns-default-dcdxm\" (UID: \"93c7931a-f6cf-4936-ae5f-be56fc6b21ed\") " pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.006780 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.007080 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" event={"ID":"212ffb51-33a2-4282-afed-de31b0da9d84","Type":"ContainerStarted","Data":"408b5e14cc908d2077d9ec868ccfa246abd2151db6f84c4d51380353395703b0"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.007106 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" event={"ID":"212ffb51-33a2-4282-afed-de31b0da9d84","Type":"ContainerStarted","Data":"41b076d76ce92d234b7f35f4ca6f04ea690e9d9ef07f6b7ffcdb3697af0129a0"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.012049 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" event={"ID":"ed732303-6d90-47f7-ada6-c88b84249ddb","Type":"ContainerStarted","Data":"141699f5cc80b0db43934d04534cc06ed5479f36b4c166365d4ead156cf0eb11"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.012291 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" event={"ID":"ed732303-6d90-47f7-ada6-c88b84249ddb","Type":"ContainerStarted","Data":"9be37cef86586a92cc71ca53bdc13d43d1d38212688ff1ee11f2d85064887e5f"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.013945 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.014191 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk7gp\" (UniqueName: \"kubernetes.io/projected/473c594e-bb80-4800-9eee-61fdc502cd5b-kube-api-access-kk7gp\") pod \"service-ca-9c57cc56f-888z8\" (UID: \"473c594e-bb80-4800-9eee-61fdc502cd5b\") " pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.017333 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65cb72ca-7028-470b-a465-61bd4cf812e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dncfj\" (UID: \"65cb72ca-7028-470b-a465-61bd4cf812e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.017858 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" event={"ID":"676dae90-1358-4195-a506-0d4bc4b651db","Type":"ContainerStarted","Data":"3461368f311368c3bec3ced0df594a86e296d038c01a469cd26f6bfdf4fdf1a8"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.017888 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" 
event={"ID":"676dae90-1358-4195-a506-0d4bc4b651db","Type":"ContainerStarted","Data":"5e05f7cc01496f55c188655f3a08e271d124a80f9f5fd50c43ad00e6335b7e6b"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.021875 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" event={"ID":"b52b45bc-5ace-4daa-8548-030f576ece0f","Type":"ContainerStarted","Data":"64b3e70008564ef948e77db2a6fdaaadab8cf985c617c874628182430232badc"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.021913 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" event={"ID":"b52b45bc-5ace-4daa-8548-030f576ece0f","Type":"ContainerStarted","Data":"61d2aed87705f5838658152fa53a5aab3784e72ea45130fb9695b564ed524030"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.024280 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" event={"ID":"628bce78-0626-4d43-af74-56d40f41679a","Type":"ContainerStarted","Data":"21c6d148ddef36eedfa76af7407aec6ba1cb0ee3123189760fb22b15d6b3652e"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.025950 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" event={"ID":"57e2a16d-ea83-4e99-844e-089ccba97f47","Type":"ContainerStarted","Data":"2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.025971 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" event={"ID":"57e2a16d-ea83-4e99-844e-089ccba97f47","Type":"ContainerStarted","Data":"0b34bb744e5ece1e83c13da7480e6c48ed6027512711ba27ff867cf6609e60be"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.026256 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.028607 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-888z8" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.030603 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" event={"ID":"47473970-6704-4bb3-83fb-eee0a9db5552","Type":"ContainerStarted","Data":"d588eb3c8b2eccc8ab3ce49175b53d2700785a61a11426ca838a97f63a47d383"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.030629 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" event={"ID":"47473970-6704-4bb3-83fb-eee0a9db5552","Type":"ContainerStarted","Data":"a885e00f0550c6c4edf6d1578bb60b345da9b2b776a478c90eb8748381eb3087"} Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.030868 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.069538 4618 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fnvx2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.069572 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" podUID="57e2a16d-ea83-4e99-844e-089ccba97f47" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.070775 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.075978 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.076182 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cmnsm" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.102684 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.107334 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.607319714 +0000 UTC m=+137.357787031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.175353 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dp9f8"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.190306 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-prprb"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.206265 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.206911 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.207794 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.707778392 +0000 UTC m=+137.458245709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.219783 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.224670 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.230850 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cj7pg"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.308696 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.308911 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.808902179 +0000 UTC m=+137.559369497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: W0121 09:05:38.340722 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6177812c_28b6_4b42_92ba_0dab630aa890.slice/crio-5c79fc6b54e4dfa645fdc0e9d741a4e8dd272f80010f13aae09cb27d39297544 WatchSource:0}: Error finding container 5c79fc6b54e4dfa645fdc0e9d741a4e8dd272f80010f13aae09cb27d39297544: Status 404 returned error can't find the container with id 5c79fc6b54e4dfa645fdc0e9d741a4e8dd272f80010f13aae09cb27d39297544 Jan 21 09:05:38 crc kubenswrapper[4618]: W0121 09:05:38.342983 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e68c4f9_c1f8_4c7a_83e8_6ed6a7dcdc82.slice/crio-76154dd0683581fb1bc3cb14ef8959c9a9b351967a0650d7fe636703a115ded9 WatchSource:0}: Error finding container 76154dd0683581fb1bc3cb14ef8959c9a9b351967a0650d7fe636703a115ded9: Status 404 returned error can't find the container with id 76154dd0683581fb1bc3cb14ef8959c9a9b351967a0650d7fe636703a115ded9 Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.362829 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.417456 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.417729 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:38.917716444 +0000 UTC m=+137.668183761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.519859 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.520268 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.020258104 +0000 UTC m=+137.770725421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.522191 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mf758"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.620530 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.620609 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.120592288 +0000 UTC m=+137.871059605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.620807 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.621050 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.121039929 +0000 UTC m=+137.871507245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.721480 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.721828 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.221816694 +0000 UTC m=+137.972284012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.823773 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.824085 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.324071234 +0000 UTC m=+138.074538551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.869908 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-542mv"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.870387 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd"] Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.925752 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:38 crc kubenswrapper[4618]: E0121 09:05:38.926195 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.426182646 +0000 UTC m=+138.176649962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:38 crc kubenswrapper[4618]: I0121 09:05:38.969647 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6n2rc" podStartSLOduration=116.969635189 podStartE2EDuration="1m56.969635189s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:38.968802535 +0000 UTC m=+137.719269852" watchObservedRunningTime="2026-01-21 09:05:38.969635189 +0000 UTC m=+137.720102506" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.034035 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.034297 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.534287247 +0000 UTC m=+138.284754565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.085544 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d586623-ff45-4d91-8afb-328f6f392a39" containerID="71e74505b67c77ccb5384068509f67b7f756be898627c5ae27751909e7b0cc32" exitCode=0 Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.085648 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" event={"ID":"1d586623-ff45-4d91-8afb-328f6f392a39","Type":"ContainerDied","Data":"71e74505b67c77ccb5384068509f67b7f756be898627c5ae27751909e7b0cc32"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.112350 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mf758" event={"ID":"b98c0d00-f768-4198-b363-7fd9aa977d2c","Type":"ContainerStarted","Data":"780eefb3f2d52d6d87747b3e3a0c7969c72c5d6e8ac9fb61b2ed06880a294d65"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.139470 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" event={"ID":"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f","Type":"ContainerStarted","Data":"6c95165381e09514912156d529cdf18cf132a40881dbe03d8df3f0ce4408ba81"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.139504 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" 
event={"ID":"c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f","Type":"ContainerStarted","Data":"49fce90e894dc47154a296bd5c71d16c74ccd345c1c642b957cbb7dfed50786e"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.140084 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.140425 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.640414447 +0000 UTC m=+138.390881764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.152677 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" event={"ID":"628bce78-0626-4d43-af74-56d40f41679a","Type":"ContainerStarted","Data":"69c356a36797258b6965d3b78edf156672f211a913d8eabbbc0d7efacaffbefb"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.161185 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" 
event={"ID":"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3","Type":"ContainerStarted","Data":"e49541065924cd1cbb4638f0b7c8d0f30f005ec4f46fdf2d7df1a89129b5d7b3"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.161233 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" event={"ID":"cbe12fe5-0811-4cdc-9ee7-3211f0c2fbe3","Type":"ContainerStarted","Data":"cdf670ec4f2a24be2c94f752f91a4e3b7966694c5d39a826211cbef061c598a7"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.175335 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" event={"ID":"b52b45bc-5ace-4daa-8548-030f576ece0f","Type":"ContainerStarted","Data":"b7a0a5791e01cb5ef83bece6cac87f87dff62d485f28ef29902513bae391c259"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.192609 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" event={"ID":"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b","Type":"ContainerStarted","Data":"7f3376883125dcd2e72a1d165031f6ac49757812b91e056f404bec6e3f1999eb"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.192650 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" event={"ID":"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b","Type":"ContainerStarted","Data":"32b7c3fe3452904621d973c65cc3e7c7be68a503e824d7aa9755fc6a64092b17"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.200230 4618 generic.go:334] "Generic (PLEG): container finished" podID="b9341e4b-9c69-4d1d-90a7-13e6b9bc508d" containerID="f015798679076b746762bcc4e8e845254ed727136e8d2f874404bbfe913c54b4" exitCode=0 Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.200285 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" 
event={"ID":"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d","Type":"ContainerDied","Data":"f015798679076b746762bcc4e8e845254ed727136e8d2f874404bbfe913c54b4"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.249794 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.250526 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cmnsm" event={"ID":"b2bf9a75-0044-4ca7-822e-db64a24c6c74","Type":"ContainerStarted","Data":"ed81c2202aa32be0caac4b12daa1d8873ac62550a92d16061439685b5c6b1d2d"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.250567 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cmnsm" event={"ID":"b2bf9a75-0044-4ca7-822e-db64a24c6c74","Type":"ContainerStarted","Data":"04c2c1c7ef03d011daccd6e463415c9ca6d88bd72245f630a65ec4de106d9541"} Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.251363 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.751352178 +0000 UTC m=+138.501819496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.259175 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4jk5f" podStartSLOduration=117.259160707 podStartE2EDuration="1m57.259160707s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.222393775 +0000 UTC m=+137.972861093" watchObservedRunningTime="2026-01-21 09:05:39.259160707 +0000 UTC m=+138.009628014" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.296273 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" event={"ID":"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82","Type":"ContainerStarted","Data":"76154dd0683581fb1bc3cb14ef8959c9a9b351967a0650d7fe636703a115ded9"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.353584 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.354405 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.854387143 +0000 UTC m=+138.604854460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.384276 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" event={"ID":"6177812c-28b6-4b42-92ba-0dab630aa890","Type":"ContainerStarted","Data":"5c79fc6b54e4dfa645fdc0e9d741a4e8dd272f80010f13aae09cb27d39297544"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.405900 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sknqd" event={"ID":"30a8096c-9962-4989-9811-54a9522f4e2e","Type":"ContainerStarted","Data":"8aa40224c3307dfb7c013a6a4b78e5b14ffeb272be33d9b74459eed0c3668a79"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.405940 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sknqd" event={"ID":"30a8096c-9962-4989-9811-54a9522f4e2e","Type":"ContainerStarted","Data":"4607a52e2a00464518d32396f71bd6e2855c786ef92eae8777d30fc31b435c20"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.418074 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" podStartSLOduration=117.418052311 podStartE2EDuration="1m57.418052311s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.417827449 +0000 UTC m=+138.168294766" watchObservedRunningTime="2026-01-21 09:05:39.418052311 +0000 UTC m=+138.168519628" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.418261 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" podStartSLOduration=117.418257155 podStartE2EDuration="1m57.418257155s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.376648594 +0000 UTC m=+138.127115912" watchObservedRunningTime="2026-01-21 09:05:39.418257155 +0000 UTC m=+138.168724472" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.436835 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" event={"ID":"f4159e19-631f-40cf-b53f-fb42d9171a06","Type":"ContainerStarted","Data":"23d130f7f0fa6c2ee660873213ff68808335bb544b2ac2772af762d878f8a6c4"} Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.437430 4618 patch_prober.go:28] interesting pod/downloads-7954f5f757-4jk5f container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.437452 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4jk5f" podUID="b3d29e61-443b-4040-b68b-4ad190ea08be" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.450689 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.454310 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2zw6g" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.454878 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.455935 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:39.955925539 +0000 UTC m=+138.706392857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.531635 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-b7sq5" podStartSLOduration=117.531622814 podStartE2EDuration="1m57.531622814s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.530680825 +0000 UTC m=+138.281148142" watchObservedRunningTime="2026-01-21 09:05:39.531622814 +0000 UTC m=+138.282090131" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.559247 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.561117 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.061098578 +0000 UTC m=+138.811565895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.599926 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qr5fr" podStartSLOduration=117.599912532 podStartE2EDuration="1m57.599912532s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.595705825 +0000 UTC m=+138.346173142" watchObservedRunningTime="2026-01-21 09:05:39.599912532 +0000 UTC m=+138.350379849" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.603286 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.604485 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.640751 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.662317 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: 
\"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.662627 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.162615864 +0000 UTC m=+138.913083181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.683485 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.689109 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.695646 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv"] Jan 21 09:05:39 crc kubenswrapper[4618]: W0121 09:05:39.698948 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfa28fb_8191_475f_880a_1da9f1f88d85.slice/crio-643673fb64c9eff2b6797b91bbfa430296735753ecbbe532633488e51541892c WatchSource:0}: Error finding container 643673fb64c9eff2b6797b91bbfa430296735753ecbbe532633488e51541892c: Status 404 returned error can't find the container with id 
643673fb64c9eff2b6797b91bbfa430296735753ecbbe532633488e51541892c Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.705077 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" podStartSLOduration=116.70506367 podStartE2EDuration="1m56.70506367s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.694584487 +0000 UTC m=+138.445051804" watchObservedRunningTime="2026-01-21 09:05:39.70506367 +0000 UTC m=+138.455530987" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.705212 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.714102 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.730898 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.731210 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4"] Jan 21 09:05:39 crc kubenswrapper[4618]: W0121 09:05:39.734185 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce155415_eec3_4b54_be9f_5e0729c8f1a2.slice/crio-d85d375d77685b62e1971ea470a246e0867d246fb588695f49b4a5fa0ea15a78 WatchSource:0}: Error finding container d85d375d77685b62e1971ea470a246e0867d246fb588695f49b4a5fa0ea15a78: Status 404 returned error can't find the container with id 
d85d375d77685b62e1971ea470a246e0867d246fb588695f49b4a5fa0ea15a78 Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.741190 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dd7c" podStartSLOduration=117.741172947 podStartE2EDuration="1m57.741172947s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.734472639 +0000 UTC m=+138.484939956" watchObservedRunningTime="2026-01-21 09:05:39.741172947 +0000 UTC m=+138.491640264" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.742498 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-888z8"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.762934 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.763302 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.263287572 +0000 UTC m=+139.013754890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: W0121 09:05:39.765618 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod473c594e_bb80_4800_9eee_61fdc502cd5b.slice/crio-405b1112e78d9d46e27dc4ce713cdb5915af32ada07c576a17dea4628cb38913 WatchSource:0}: Error finding container 405b1112e78d9d46e27dc4ce713cdb5915af32ada07c576a17dea4628cb38913: Status 404 returned error can't find the container with id 405b1112e78d9d46e27dc4ce713cdb5915af32ada07c576a17dea4628cb38913 Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.770922 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" podStartSLOduration=117.770911585 podStartE2EDuration="1m57.770911585s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.770785858 +0000 UTC m=+138.521253185" watchObservedRunningTime="2026-01-21 09:05:39.770911585 +0000 UTC m=+138.521378912" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.837728 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.852641 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dcdxm"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 
09:05:39.864503 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.866103 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.365714967 +0000 UTC m=+139.116182283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.869988 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-55flm"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.919298 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn"] Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.944057 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:39 crc kubenswrapper[4618]: W0121 09:05:39.948638 4618 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7867ab6f_4cdb_492d_9106_b1f42a66b62e.slice/crio-348a5cccdfc016a870cdcdeabe33352ad6734437168d47404d41164b063aee5d WatchSource:0}: Error finding container 348a5cccdfc016a870cdcdeabe33352ad6734437168d47404d41164b063aee5d: Status 404 returned error can't find the container with id 348a5cccdfc016a870cdcdeabe33352ad6734437168d47404d41164b063aee5d Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.950088 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:39 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:39 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:39 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.950127 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:39 crc kubenswrapper[4618]: I0121 09:05:39.964879 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:39 crc kubenswrapper[4618]: E0121 09:05:39.965220 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 09:05:40.465208524 +0000 UTC m=+139.215675841 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:39 crc kubenswrapper[4618]: W0121 09:05:39.970506 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d344a8_e5dd_4c6b_8229_61db4a629d3a.slice/crio-1caee569396e608ba219b23e5b7b2a05cdd58b4fbd1c6164273686e4fb437138 WatchSource:0}: Error finding container 1caee569396e608ba219b23e5b7b2a05cdd58b4fbd1c6164273686e4fb437138: Status 404 returned error can't find the container with id 1caee569396e608ba219b23e5b7b2a05cdd58b4fbd1c6164273686e4fb437138 Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.007769 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qvpkf" podStartSLOduration=118.007753181 podStartE2EDuration="1m58.007753181s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.006905501 +0000 UTC m=+138.757372817" watchObservedRunningTime="2026-01-21 09:05:40.007753181 +0000 UTC m=+138.758220499" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.008234 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dd2fv" podStartSLOduration=118.008228784 podStartE2EDuration="1m58.008228784s" 
podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:39.977629922 +0000 UTC m=+138.728097239" watchObservedRunningTime="2026-01-21 09:05:40.008228784 +0000 UTC m=+138.758696101" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.068589 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.069210 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.569193722 +0000 UTC m=+139.319661040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.169842 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.175828 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.675808567 +0000 UTC m=+139.426275884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.239034 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" podStartSLOduration=118.239014743 podStartE2EDuration="1m58.239014743s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.18512101 +0000 UTC m=+138.935588327" watchObservedRunningTime="2026-01-21 09:05:40.239014743 +0000 UTC m=+138.989482060" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.254183 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kt5l4" podStartSLOduration=117.254169913 podStartE2EDuration="1m57.254169913s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.253569106 +0000 UTC m=+139.004036423" watchObservedRunningTime="2026-01-21 09:05:40.254169913 +0000 UTC m=+139.004637231" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.280513 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: 
\"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.280788 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.780778405 +0000 UTC m=+139.531245712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.374947 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cmnsm" podStartSLOduration=6.374930144 podStartE2EDuration="6.374930144s" podCreationTimestamp="2026-01-21 09:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.373486053 +0000 UTC m=+139.123953370" watchObservedRunningTime="2026-01-21 09:05:40.374930144 +0000 UTC m=+139.125397451" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.384556 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.384973 4618 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.884959592 +0000 UTC m=+139.635426909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.460790 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8flgq" podStartSLOduration=118.460774247 podStartE2EDuration="1m58.460774247s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.406331573 +0000 UTC m=+139.156798889" watchObservedRunningTime="2026-01-21 09:05:40.460774247 +0000 UTC m=+139.211241564" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.467839 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" event={"ID":"9261bd75-b91b-490f-ad63-ac8e49832d51","Type":"ContainerStarted","Data":"3aea4232365be44813b61968f2d4bae05d4293e06fa2aa73bf634679937b6d36"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.475538 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-542mv" 
event={"ID":"55b440b2-92ea-462a-97f8-d59f7d92880a","Type":"ContainerStarted","Data":"93308d3e731208ec250591cda79e9d29ee3f079f14581741666097d452e1e877"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.475580 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-542mv" event={"ID":"55b440b2-92ea-462a-97f8-d59f7d92880a","Type":"ContainerStarted","Data":"aba0006bcae80c30ceaa984dbc6193c6076daa2ca1f91bc2a13715375e20a3a0"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.484574 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9t8g5" podStartSLOduration=117.484561702 podStartE2EDuration="1m57.484561702s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.483280367 +0000 UTC m=+139.233747684" watchObservedRunningTime="2026-01-21 09:05:40.484561702 +0000 UTC m=+139.235029010" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.486328 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.486558 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:40.986549605 +0000 UTC m=+139.737016921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.504743 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" event={"ID":"b9341e4b-9c69-4d1d-90a7-13e6b9bc508d","Type":"ContainerStarted","Data":"edb3e9eaf05dcc2effab1a137339d2d3d97256e77ff72853109035a12f4e5752"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.505279 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.506260 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dznvh" podStartSLOduration=118.506241452 podStartE2EDuration="1m58.506241452s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.505635384 +0000 UTC m=+139.256102701" watchObservedRunningTime="2026-01-21 09:05:40.506241452 +0000 UTC m=+139.256708768" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.506761 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-prprb" event={"ID":"6177812c-28b6-4b42-92ba-0dab630aa890","Type":"ContainerStarted","Data":"02ec0585f17ecae74935fdb2f1f5a711b541b68aa493d89319562762be8de3d0"} Jan 21 09:05:40 crc kubenswrapper[4618]: 
I0121 09:05:40.535955 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" event={"ID":"65cb72ca-7028-470b-a465-61bd4cf812e1","Type":"ContainerStarted","Data":"af2953c27332a5af55d6486f0eb3e916f4e744bb6545d969c8ceed6c8d2f80ea"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.543660 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-sknqd" podStartSLOduration=118.543639848 podStartE2EDuration="1m58.543639848s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.542228569 +0000 UTC m=+139.292695886" watchObservedRunningTime="2026-01-21 09:05:40.543639848 +0000 UTC m=+139.294107165" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.543845 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" event={"ID":"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f","Type":"ContainerStarted","Data":"cbded314dbfbbb662256c5fa3f553487fdc841f911fcb49fc281014f41302f37"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.545082 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" event={"ID":"f2e21a51-1af8-4cc3-9df8-9cd3beb09a0f","Type":"ContainerStarted","Data":"6df2e2ab6d08174ffa2179d08052a6581ba17ee9c3b29565e007354e48edd50e"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.545104 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.547859 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" event={"ID":"a09bb023-d629-46af-bc03-a760dbdec6ff","Type":"ContainerStarted","Data":"455b4e3545ab5fdba079a05604965597d47ac4a16c0c7362647f8367df95e658"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.547887 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" event={"ID":"a09bb023-d629-46af-bc03-a760dbdec6ff","Type":"ContainerStarted","Data":"0729beb22f16c82c3a88fb10705363bcf3eb9f9fcce1216925e6705f24b286d1"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.550934 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dcdxm" event={"ID":"93c7931a-f6cf-4936-ae5f-be56fc6b21ed","Type":"ContainerStarted","Data":"c1c5d8a91fe0f6e29c8466a35568d42f42833daec6f44cd61adf4171690600fb"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.569524 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" event={"ID":"ce155415-eec3-4b54-be9f-5e0729c8f1a2","Type":"ContainerStarted","Data":"659c4877c36bc0ac68ba6b84ca52237e8939e838ae421db00c1feebefa1c7ab3"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.569567 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" event={"ID":"ce155415-eec3-4b54-be9f-5e0729c8f1a2","Type":"ContainerStarted","Data":"d85d375d77685b62e1971ea470a246e0867d246fb588695f49b4a5fa0ea15a78"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.574078 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.586771 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" 
event={"ID":"1d586623-ff45-4d91-8afb-328f6f392a39","Type":"ContainerStarted","Data":"28982bb822c5120c8391d78e4e5de25e146584c7f3e3aa850d878ada5886060f"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.587168 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.587450 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.087407002 +0000 UTC m=+139.837874319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.588726 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.589515 4618 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.089483058 +0000 UTC m=+139.839950376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.599956 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" event={"ID":"5feca275-ba2c-47ac-aba6-071b8fc7a6d9","Type":"ContainerStarted","Data":"a7c56da630f1af27b743c0670bc587ec5e9a48e2e6530cd39db917f1b1251f7c"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.599988 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" event={"ID":"5feca275-ba2c-47ac-aba6-071b8fc7a6d9","Type":"ContainerStarted","Data":"734ebdd1b540c04c2801df7607106db84025a8735ed384dd9b54c25787e17a1f"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.600517 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.601266 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" event={"ID":"53d344a8-e5dd-4c6b-8229-61db4a629d3a","Type":"ContainerStarted","Data":"1caee569396e608ba219b23e5b7b2a05cdd58b4fbd1c6164273686e4fb437138"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.610674 4618 
patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jvxsv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.610715 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" podUID="5feca275-ba2c-47ac-aba6-071b8fc7a6d9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.611848 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" event={"ID":"981120c7-979b-4459-a147-49279a72f3a8","Type":"ContainerStarted","Data":"95a751f426556f2366877ef18b69e078274e8678db7180595c0e25079976b518"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.611906 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" event={"ID":"981120c7-979b-4459-a147-49279a72f3a8","Type":"ContainerStarted","Data":"78fae9c4b78a7d14b185e50d68d57acc254af1454204c78a4ccb68595e181b50"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.626108 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-542mv" podStartSLOduration=6.626090361 podStartE2EDuration="6.626090361s" podCreationTimestamp="2026-01-21 09:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.582874122 +0000 UTC m=+139.333341440" watchObservedRunningTime="2026-01-21 09:05:40.626090361 +0000 
UTC m=+139.376557678" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.626282 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" podStartSLOduration=118.626278604 podStartE2EDuration="1m58.626278604s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.624808124 +0000 UTC m=+139.375275441" watchObservedRunningTime="2026-01-21 09:05:40.626278604 +0000 UTC m=+139.376745921" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.633614 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" event={"ID":"628bce78-0626-4d43-af74-56d40f41679a","Type":"ContainerStarted","Data":"7d59a74c4ef862c6b23667374172094a7ed9c5d5b9380b6b9c0c244217a428bb"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.652899 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" event={"ID":"e333f8e7-fa2a-41d8-a917-dc9f00a556d5","Type":"ContainerStarted","Data":"b7130e68bb2a9e0f5788b9a609e41ab6ad8adf669665a1cdc22d7a02b800873d"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.653043 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" event={"ID":"e333f8e7-fa2a-41d8-a917-dc9f00a556d5","Type":"ContainerStarted","Data":"895fac377b1f35495be3eb18f69060cff3b64af2a73d05e9f6bca505c26978b9"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.655908 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" podStartSLOduration=118.655895604 podStartE2EDuration="1m58.655895604s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.653744375 +0000 UTC m=+139.404211692" watchObservedRunningTime="2026-01-21 09:05:40.655895604 +0000 UTC m=+139.406362921" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.658330 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" event={"ID":"3dfa28fb-8191-475f-880a-1da9f1f88d85","Type":"ContainerStarted","Data":"db51ae900363240464e8f31764b10757e38b7133797ad5ca3b3e44de1a118ecc"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.659229 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" event={"ID":"3dfa28fb-8191-475f-880a-1da9f1f88d85","Type":"ContainerStarted","Data":"643673fb64c9eff2b6797b91bbfa430296735753ecbbe532633488e51541892c"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.661038 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" event={"ID":"fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b","Type":"ContainerStarted","Data":"109ada2bec99658d551ccd987875becf6aae829942aab3e165bbdea44f3fc07b"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.669939 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-888z8" event={"ID":"473c594e-bb80-4800-9eee-61fdc502cd5b","Type":"ContainerStarted","Data":"405b1112e78d9d46e27dc4ce713cdb5915af32ada07c576a17dea4628cb38913"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.690726 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" event={"ID":"62410b31-13fa-4314-8140-89abdac98679","Type":"ContainerStarted","Data":"90c4e7e69531cd961ec0232845e0928ab3f850b7887ca5b58915f0369dba44e9"} Jan 21 09:05:40 crc kubenswrapper[4618]: 
I0121 09:05:40.690781 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" event={"ID":"62410b31-13fa-4314-8140-89abdac98679","Type":"ContainerStarted","Data":"8026e420c1c4996fbd9c781c48edde0d186b948a3590c33c31720ddb5222a3d5"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.691265 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.691863 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.693090 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.193074899 +0000 UTC m=+139.943542216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.696775 4618 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rz25z container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.696876 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" podUID="62410b31-13fa-4314-8140-89abdac98679" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.704863 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" event={"ID":"e06263ed-35a1-4532-ba9a-8521ec8a5b1d","Type":"ContainerStarted","Data":"fd7d181e949ef57516c19b2d6ac95a0bc112bbc4e32b560f2176adf303104b61"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.704990 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" event={"ID":"e06263ed-35a1-4532-ba9a-8521ec8a5b1d","Type":"ContainerStarted","Data":"740936a5041d529c880383e51c50ecf17c07c3dc6cb77b992a01e28c83d32d8c"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.722333 4618 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mf758" event={"ID":"b98c0d00-f768-4198-b363-7fd9aa977d2c","Type":"ContainerStarted","Data":"63826ce99e0f76a438bf67e5ff86505057975b22adda8d01ef4b1b63a122f720"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.734266 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7hcr" podStartSLOduration=117.734255807 podStartE2EDuration="1m57.734255807s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.702776372 +0000 UTC m=+139.453243689" watchObservedRunningTime="2026-01-21 09:05:40.734255807 +0000 UTC m=+139.484723125" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.735956 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9sns4" podStartSLOduration=118.735948034 podStartE2EDuration="1m58.735948034s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.735416286 +0000 UTC m=+139.485883603" watchObservedRunningTime="2026-01-21 09:05:40.735948034 +0000 UTC m=+139.486415350" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.736050 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" event={"ID":"8ba2e944-8a22-4324-ba71-bca1844cb472","Type":"ContainerStarted","Data":"3f69d6e81ad10a6636c25e867ba7fd7b65ab1c79733ba8c4408211de8a1c41be"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.736233 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" event={"ID":"8ba2e944-8a22-4324-ba71-bca1844cb472","Type":"ContainerStarted","Data":"e4cebbb0c3ad045faea74bd349385d40de806dec3150d02b34d9c739e6edd630"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.736297 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.753656 4618 generic.go:334] "Generic (PLEG): container finished" podID="7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82" containerID="2ef95ae4f84f69e7d4f42616c8fb97c04c11439fa9fad3ba601a307249b73421" exitCode=0 Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.753734 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" event={"ID":"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82","Type":"ContainerStarted","Data":"470b5eea8039ceca9188ead9f56401777d7c5c55de8b5b2d3905fb6689c2a646"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.753767 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" event={"ID":"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82","Type":"ContainerDied","Data":"2ef95ae4f84f69e7d4f42616c8fb97c04c11439fa9fad3ba601a307249b73421"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.760671 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" event={"ID":"7867ab6f-4cdb-492d-9106-b1f42a66b62e","Type":"ContainerStarted","Data":"452a6e9e37b1b5ed72e29a0a789a22568e187987e0fe300d313f915d3551551a"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.760722 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" 
event={"ID":"7867ab6f-4cdb-492d-9106-b1f42a66b62e","Type":"ContainerStarted","Data":"348a5cccdfc016a870cdcdeabe33352ad6734437168d47404d41164b063aee5d"} Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.761412 4618 patch_prober.go:28] interesting pod/downloads-7954f5f757-4jk5f container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.761445 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4jk5f" podUID="b3d29e61-443b-4040-b68b-4ad190ea08be" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.779233 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" podStartSLOduration=117.779213405 podStartE2EDuration="1m57.779213405s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.769319872 +0000 UTC m=+139.519787189" watchObservedRunningTime="2026-01-21 09:05:40.779213405 +0000 UTC m=+139.529680722" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.794595 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.797894 4618 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.297861042 +0000 UTC m=+140.048328359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.815667 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" podStartSLOduration=117.81564663 podStartE2EDuration="1m57.81564663s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.81552913 +0000 UTC m=+139.565996447" watchObservedRunningTime="2026-01-21 09:05:40.81564663 +0000 UTC m=+139.566113948" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.863668 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" podStartSLOduration=117.863648472 podStartE2EDuration="1m57.863648472s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.862761046 +0000 UTC m=+139.613228354" watchObservedRunningTime="2026-01-21 09:05:40.863648472 +0000 UTC m=+139.614115789" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 
09:05:40.896894 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:40 crc kubenswrapper[4618]: E0121 09:05:40.898589 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.398570361 +0000 UTC m=+140.149037677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.903389 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gwzmb" podStartSLOduration=117.903373648 podStartE2EDuration="1m57.903373648s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.901335151 +0000 UTC m=+139.651802467" watchObservedRunningTime="2026-01-21 09:05:40.903373648 +0000 UTC m=+139.653840965" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.934383 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" 
podStartSLOduration=117.934364816 podStartE2EDuration="1m57.934364816s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.933401778 +0000 UTC m=+139.683869096" watchObservedRunningTime="2026-01-21 09:05:40.934364816 +0000 UTC m=+139.684832134" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.948265 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:40 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:40 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:40 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.948494 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.969248 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dp9f8" podStartSLOduration=117.969234867 podStartE2EDuration="1m57.969234867s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:40.969215761 +0000 UTC m=+139.719683079" watchObservedRunningTime="2026-01-21 09:05:40.969234867 +0000 UTC m=+139.719702185" Jan 21 09:05:40 crc kubenswrapper[4618]: I0121 09:05:40.999850 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.000478 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.500465336 +0000 UTC m=+140.250932653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.018866 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" podStartSLOduration=119.01885077 podStartE2EDuration="1m59.01885077s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.016185878 +0000 UTC m=+139.766653195" watchObservedRunningTime="2026-01-21 09:05:41.01885077 +0000 UTC m=+139.769318087" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.051927 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" podStartSLOduration=119.051910692 podStartE2EDuration="1m59.051910692s" 
podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.051338779 +0000 UTC m=+139.801806096" watchObservedRunningTime="2026-01-21 09:05:41.051910692 +0000 UTC m=+139.802378000" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.100593 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.100886 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.600873529 +0000 UTC m=+140.351340847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.103478 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-888z8" podStartSLOduration=118.10345698 podStartE2EDuration="1m58.10345698s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.100308268 +0000 UTC m=+139.850775586" watchObservedRunningTime="2026-01-21 09:05:41.10345698 +0000 UTC m=+139.853924297" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.141994 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" podStartSLOduration=118.14198088 podStartE2EDuration="1m58.14198088s" podCreationTimestamp="2026-01-21 09:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.139988068 +0000 UTC m=+139.890455385" watchObservedRunningTime="2026-01-21 09:05:41.14198088 +0000 UTC m=+139.892448187" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.202466 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: 
\"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.202735 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.702725715 +0000 UTC m=+140.453193032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.215657 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bzz2k" podStartSLOduration=119.215636752 podStartE2EDuration="1m59.215636752s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.173062167 +0000 UTC m=+139.923529484" watchObservedRunningTime="2026-01-21 09:05:41.215636752 +0000 UTC m=+139.966104069" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.304038 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.304170 4618 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.804154204 +0000 UTC m=+140.554621521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.304348 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.304597 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.804589501 +0000 UTC m=+140.555056818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.405536 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.405978 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:41.905967665 +0000 UTC m=+140.656434982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.506768 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.507105 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.007086223 +0000 UTC m=+140.757553540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.587790 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bjndc" podStartSLOduration=119.587775919 podStartE2EDuration="1m59.587775919s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.215588712 +0000 UTC m=+139.966056029" watchObservedRunningTime="2026-01-21 09:05:41.587775919 +0000 UTC m=+140.338243237" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.607901 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.608382 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.108370421 +0000 UTC m=+140.858837738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.710213 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.710464 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.210454391 +0000 UTC m=+140.960921709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.756062 4618 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.766942 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-888z8" event={"ID":"473c594e-bb80-4800-9eee-61fdc502cd5b","Type":"ContainerStarted","Data":"2cbe75c93561a129ee731ccfbe8c1ce379c146b2edd26570833adfae9de92f4a"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.768585 4618 generic.go:334] "Generic (PLEG): container finished" podID="e06263ed-35a1-4532-ba9a-8521ec8a5b1d" containerID="fd7d181e949ef57516c19b2d6ac95a0bc112bbc4e32b560f2176adf303104b61" exitCode=0 Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.768668 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" event={"ID":"e06263ed-35a1-4532-ba9a-8521ec8a5b1d","Type":"ContainerDied","Data":"fd7d181e949ef57516c19b2d6ac95a0bc112bbc4e32b560f2176adf303104b61"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.777330 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mf758" event={"ID":"b98c0d00-f768-4198-b363-7fd9aa977d2c","Type":"ContainerStarted","Data":"822208f76c3a0a9f3bd726a7c8fea4c9086aca247f700dd5137b0d9d6cb56532"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 
09:05:41.777374 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mf758" event={"ID":"b98c0d00-f768-4198-b363-7fd9aa977d2c","Type":"ContainerStarted","Data":"dba696cd34b5d407bcc26aea1366ea21d8ab065ffe4eab5a0cde3cb25807a0be"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.777386 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mf758" event={"ID":"b98c0d00-f768-4198-b363-7fd9aa977d2c","Type":"ContainerStarted","Data":"b7b27008e42890e4ff042ef77c26032a80b9c2bd4a1adda0cc143a56eab02afc"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.783259 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" event={"ID":"65cb72ca-7028-470b-a465-61bd4cf812e1","Type":"ContainerStarted","Data":"90a084857f5c74b75dfbf4c2fd8b3d13205318a36d180345836045bf6b4e57a6"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.790159 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" event={"ID":"8ba2e944-8a22-4324-ba71-bca1844cb472","Type":"ContainerStarted","Data":"0c4c48617f0bced7aa0462d24f55ac2f0f5ec7cfaceb4bc3322079005cc08342"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.793498 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dcdxm" event={"ID":"93c7931a-f6cf-4936-ae5f-be56fc6b21ed","Type":"ContainerStarted","Data":"39c0474636722a251e28f398b3d1e5446201649eb49ae53fbe48caae577e5d20"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.793531 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dcdxm" event={"ID":"93c7931a-f6cf-4936-ae5f-be56fc6b21ed","Type":"ContainerStarted","Data":"3afcf044614921e5d224bdff59528aa02badcbbfb88835998c19b5e64157fc81"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.793902 
4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.796066 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" event={"ID":"ce155415-eec3-4b54-be9f-5e0729c8f1a2","Type":"ContainerStarted","Data":"65f007c26f93a3c39f49427c41161769c221b5f493c0a8fe0fbabb00cdbe8723"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.805550 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" event={"ID":"9261bd75-b91b-490f-ad63-ac8e49832d51","Type":"ContainerStarted","Data":"87c624ea178892b78feb3a1f972d1c25c2735469879c23f9922a73d6a5b0ab4e"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.805585 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" event={"ID":"9261bd75-b91b-490f-ad63-ac8e49832d51","Type":"ContainerStarted","Data":"3c861e11fd74dae93af762137f1f2c5173deec59f32c1834221894e94f813312"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.805862 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mf758" podStartSLOduration=7.805843492 podStartE2EDuration="7.805843492s" podCreationTimestamp="2026-01-21 09:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.794764124 +0000 UTC m=+140.545231440" watchObservedRunningTime="2026-01-21 09:05:41.805843492 +0000 UTC m=+140.556310810" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.810294 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" 
event={"ID":"53d344a8-e5dd-4c6b-8229-61db4a629d3a","Type":"ContainerStarted","Data":"63171f1086c6f59ff40c8449c8d00d350cc2f18e5ee9a563aec8d8196d17e1b9"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.810326 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" event={"ID":"53d344a8-e5dd-4c6b-8229-61db4a629d3a","Type":"ContainerStarted","Data":"ce035cf80fc2942ac30fbf1316205f96654cf524d650c7da57a8d3c03682216e"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.810703 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.811490 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.311471989 +0000 UTC m=+141.061939306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.811710 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.811925 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjn98" event={"ID":"a09bb023-d629-46af-bc03-a760dbdec6ff","Type":"ContainerStarted","Data":"d3b66ff2acac8bef4dd8b304681ede54db8a34daacb5260e9c4012a7efcc9d82"} Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.812049 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.312034695 +0000 UTC m=+141.062502012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.817068 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" event={"ID":"7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82","Type":"ContainerStarted","Data":"b09c6291118130f546a8eea2387801ad8e8520724ba5d2672960956548a6a576"} Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.819853 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.817658 4618 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-55flm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.820055 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" podUID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.823313 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dcdxm" podStartSLOduration=7.8232985710000005 podStartE2EDuration="7.823298571s" 
podCreationTimestamp="2026-01-21 09:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.822906475 +0000 UTC m=+140.573373792" watchObservedRunningTime="2026-01-21 09:05:41.823298571 +0000 UTC m=+140.573765887" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.826297 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jvxsv" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.829242 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rz25z" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.863176 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gr4qn" podStartSLOduration=119.863162687 podStartE2EDuration="1m59.863162687s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.862247989 +0000 UTC m=+140.612715306" watchObservedRunningTime="2026-01-21 09:05:41.863162687 +0000 UTC m=+140.613630003" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.865193 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dncfj" podStartSLOduration=119.865184031 podStartE2EDuration="1m59.865184031s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.842968096 +0000 UTC m=+140.593435412" watchObservedRunningTime="2026-01-21 09:05:41.865184031 +0000 UTC m=+140.615651349" Jan 21 09:05:41 crc 
kubenswrapper[4618]: E0121 09:05:41.913079 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.413058355 +0000 UTC m=+141.163525671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.912955 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.914120 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:41 crc kubenswrapper[4618]: E0121 09:05:41.920631 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 09:05:42.420609029 +0000 UTC m=+141.171076346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.937580 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4lfh9" podStartSLOduration=119.93755441 podStartE2EDuration="1m59.93755441s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.922282581 +0000 UTC m=+140.672749898" watchObservedRunningTime="2026-01-21 09:05:41.93755441 +0000 UTC m=+140.688021727" Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.946823 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:41 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:41 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:41 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.946899 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 
21 09:05:41 crc kubenswrapper[4618]: I0121 09:05:41.961828 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9dmlp" podStartSLOduration=119.961807129 podStartE2EDuration="1m59.961807129s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:41.960020175 +0000 UTC m=+140.710487492" watchObservedRunningTime="2026-01-21 09:05:41.961807129 +0000 UTC m=+140.712274447" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.012125 4618 csr.go:261] certificate signing request csr-qtt5t is approved, waiting to be issued Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.015740 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.015920 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.515894006 +0000 UTC m=+141.266361322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.016309 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.016634 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.516622923 +0000 UTC m=+141.267090240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.022255 4618 csr.go:257] certificate signing request csr-qtt5t is issued Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.117896 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.118394 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.618366755 +0000 UTC m=+141.368834072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.118539 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.118796 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.618789398 +0000 UTC m=+141.369256715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.219622 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.219816 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.719791026 +0000 UTC m=+141.470258344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.220169 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.220470 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.720463088 +0000 UTC m=+141.470930405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.301616 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cwzlc"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.302423 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.307123 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.314825 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cwzlc"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.320639 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.320796 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.820776584 +0000 UTC m=+141.571243901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.320922 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.321214 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.821201673 +0000 UTC m=+141.571668989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.421538 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.421714 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqhz8\" (UniqueName: \"kubernetes.io/projected/18ec235a-04ac-489e-92cd-e1e69c8a1074-kube-api-access-nqhz8\") pod \"certified-operators-cwzlc\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.421757 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-utilities\") pod \"certified-operators-cwzlc\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.421826 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-catalog-content\") pod \"certified-operators-cwzlc\" 
(UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.422037 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:42.921996742 +0000 UTC m=+141.672464059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.451057 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.451235 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.456884 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.499057 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qr6lr"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.500828 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.504587 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.513890 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qr6lr"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.522800 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-utilities\") pod \"certified-operators-cwzlc\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.522896 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-catalog-content\") pod \"certified-operators-cwzlc\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.522935 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.522973 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqhz8\" (UniqueName: \"kubernetes.io/projected/18ec235a-04ac-489e-92cd-e1e69c8a1074-kube-api-access-nqhz8\") pod \"certified-operators-cwzlc\" (UID: 
\"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.523506 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:43.023489361 +0000 UTC m=+141.773956679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.524251 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-catalog-content\") pod \"certified-operators-cwzlc\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.525276 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-utilities\") pod \"certified-operators-cwzlc\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.555870 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqhz8\" (UniqueName: \"kubernetes.io/projected/18ec235a-04ac-489e-92cd-e1e69c8a1074-kube-api-access-nqhz8\") pod \"certified-operators-cwzlc\" (UID: 
\"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.614591 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.623615 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.623808 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 09:05:43.123791296 +0000 UTC m=+141.874258614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.623906 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfv7\" (UniqueName: \"kubernetes.io/projected/c1c156bc-1694-457b-b26e-c46d6b5be62d-kube-api-access-8tfv7\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.624034 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-catalog-content\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.624114 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-utilities\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.624267 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: E0121 09:05:42.624616 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 09:05:43.124599763 +0000 UTC m=+141.875067081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bkf4x" (UID: "ad054293-342a-4919-b938-6032654fbc53") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.671873 4618 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-21T09:05:41.756087878Z","Handler":null,"Name":""} Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.674972 4618 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.675033 4618 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.707175 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gsw65"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.709841 4618 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.725377 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsw65"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.725905 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.726091 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-catalog-content\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.726115 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-utilities\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.726234 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfv7\" (UniqueName: \"kubernetes.io/projected/c1c156bc-1694-457b-b26e-c46d6b5be62d-kube-api-access-8tfv7\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.727313 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-catalog-content\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.733034 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.733443 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.734104 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-utilities\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.742320 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.744831 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfv7\" (UniqueName: \"kubernetes.io/projected/c1c156bc-1694-457b-b26e-c46d6b5be62d-kube-api-access-8tfv7\") pod \"community-operators-qr6lr\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.749855 4618 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cj7pg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]log ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]etcd ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/generic-apiserver-start-informers ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/max-in-flight-filter ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 21 09:05:42 crc kubenswrapper[4618]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 21 09:05:42 crc kubenswrapper[4618]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/project.openshift.io-projectcache ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/openshift.io-startinformers ok Jan 21 09:05:42 crc kubenswrapper[4618]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 21 09:05:42 crc 
kubenswrapper[4618]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 21 09:05:42 crc kubenswrapper[4618]: livez check failed Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.749942 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" podUID="7e68c4f9-c1f8-4c7a-83e8-6ed6a7dcdc82" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.775040 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cwzlc"] Jan 21 09:05:42 crc kubenswrapper[4618]: W0121 09:05:42.783268 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ec235a_04ac_489e_92cd_e1e69c8a1074.slice/crio-559d7e24931e29cff667e8f91a5b743504047817e20aefd28c0764c76d380bed WatchSource:0}: Error finding container 559d7e24931e29cff667e8f91a5b743504047817e20aefd28c0764c76d380bed: Status 404 returned error can't find the container with id 559d7e24931e29cff667e8f91a5b743504047817e20aefd28c0764c76d380bed Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.814822 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.826992 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-utilities\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.827175 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.827206 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-catalog-content\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.827243 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vbrx\" (UniqueName: \"kubernetes.io/projected/4bc1d458-1c8f-4afb-b209-be769710ccf2-kube-api-access-7vbrx\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.829096 4618 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.829126 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.829557 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwzlc" event={"ID":"18ec235a-04ac-489e-92cd-e1e69c8a1074","Type":"ContainerStarted","Data":"559d7e24931e29cff667e8f91a5b743504047817e20aefd28c0764c76d380bed"} Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.834752 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.835635 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qf46" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.837357 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wvh8n" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.871344 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bkf4x\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.910770 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvr4r"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.911748 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.921056 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvr4r"] Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.928702 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-catalog-content\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.928804 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vbrx\" (UniqueName: \"kubernetes.io/projected/4bc1d458-1c8f-4afb-b209-be769710ccf2-kube-api-access-7vbrx\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.928873 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-utilities\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.930670 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-catalog-content\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.932617 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-utilities\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.954360 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:42 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:42 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:42 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.954413 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.964657 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vbrx\" (UniqueName: \"kubernetes.io/projected/4bc1d458-1c8f-4afb-b209-be769710ccf2-kube-api-access-7vbrx\") pod \"certified-operators-gsw65\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:42 crc kubenswrapper[4618]: I0121 09:05:42.979360 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.024499 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-21 09:00:42 +0000 UTC, rotation deadline is 2026-10-24 12:34:21.068080309 +0000 UTC Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.024787 4618 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6627h28m38.043304445s for next certificate rotation Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.030856 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-utilities\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.030912 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-catalog-content\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.030949 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njcdc\" (UniqueName: \"kubernetes.io/projected/71db5598-40a0-4583-88ee-7add145de7ac-kube-api-access-njcdc\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.031230 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.132670 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-catalog-content\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.132728 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njcdc\" (UniqueName: \"kubernetes.io/projected/71db5598-40a0-4583-88ee-7add145de7ac-kube-api-access-njcdc\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.137119 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-utilities\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.137909 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-catalog-content\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.132791 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-utilities\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " 
pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.143991 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bkf4x"] Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.146486 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:43 crc kubenswrapper[4618]: W0121 09:05:43.152366 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad054293_342a_4919_b938_6032654fbc53.slice/crio-ee354de2302baa33d6a95142b735179115555b5aaf7a0faeca8bc072c4601a6d WatchSource:0}: Error finding container ee354de2302baa33d6a95142b735179115555b5aaf7a0faeca8bc072c4601a6d: Status 404 returned error can't find the container with id ee354de2302baa33d6a95142b735179115555b5aaf7a0faeca8bc072c4601a6d Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.156541 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njcdc\" (UniqueName: \"kubernetes.io/projected/71db5598-40a0-4583-88ee-7add145de7ac-kube-api-access-njcdc\") pod \"community-operators-rvr4r\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.185303 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsw65"] Jan 21 09:05:43 crc kubenswrapper[4618]: W0121 09:05:43.205249 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc1d458_1c8f_4afb_b209_be769710ccf2.slice/crio-33aa2592641ea139182a8214db880ad7647339b80570383cde43d43908eae4a9 WatchSource:0}: Error finding container 33aa2592641ea139182a8214db880ad7647339b80570383cde43d43908eae4a9: Status 404 
returned error can't find the container with id 33aa2592641ea139182a8214db880ad7647339b80570383cde43d43908eae4a9 Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.230125 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.240038 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-config-volume\") pod \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.240092 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-secret-volume\") pod \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.240129 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bcs\" (UniqueName: \"kubernetes.io/projected/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-kube-api-access-j2bcs\") pod \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\" (UID: \"e06263ed-35a1-4532-ba9a-8521ec8a5b1d\") " Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.240664 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-config-volume" (OuterVolumeSpecName: "config-volume") pod "e06263ed-35a1-4532-ba9a-8521ec8a5b1d" (UID: "e06263ed-35a1-4532-ba9a-8521ec8a5b1d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.243614 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-kube-api-access-j2bcs" (OuterVolumeSpecName: "kube-api-access-j2bcs") pod "e06263ed-35a1-4532-ba9a-8521ec8a5b1d" (UID: "e06263ed-35a1-4532-ba9a-8521ec8a5b1d"). InnerVolumeSpecName "kube-api-access-j2bcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.245046 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e06263ed-35a1-4532-ba9a-8521ec8a5b1d" (UID: "e06263ed-35a1-4532-ba9a-8521ec8a5b1d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.278833 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qr6lr"] Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.341416 4618 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.341444 4618 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.341453 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2bcs\" (UniqueName: \"kubernetes.io/projected/e06263ed-35a1-4532-ba9a-8521ec8a5b1d-kube-api-access-j2bcs\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.448430 
4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvr4r"] Jan 21 09:05:43 crc kubenswrapper[4618]: W0121 09:05:43.513594 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71db5598_40a0_4583_88ee_7add145de7ac.slice/crio-ac1d620f2649ffba1e0b416cdc74297c05e43129371c21c11bd8eacb269ca149 WatchSource:0}: Error finding container ac1d620f2649ffba1e0b416cdc74297c05e43129371c21c11bd8eacb269ca149: Status 404 returned error can't find the container with id ac1d620f2649ffba1e0b416cdc74297c05e43129371c21c11bd8eacb269ca149 Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.545880 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.834450 4618 generic.go:334] "Generic (PLEG): container finished" podID="71db5598-40a0-4583-88ee-7add145de7ac" containerID="1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228" exitCode=0 Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.834548 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvr4r" event={"ID":"71db5598-40a0-4583-88ee-7add145de7ac","Type":"ContainerDied","Data":"1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.835320 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvr4r" event={"ID":"71db5598-40a0-4583-88ee-7add145de7ac","Type":"ContainerStarted","Data":"ac1d620f2649ffba1e0b416cdc74297c05e43129371c21c11bd8eacb269ca149"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.835776 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:05:43 crc 
kubenswrapper[4618]: I0121 09:05:43.836912 4618 generic.go:334] "Generic (PLEG): container finished" podID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerID="0d5f8b657611de154f50378be5e8f2d4d336eb8019911cacab7c438655120611" exitCode=0 Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.836942 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr6lr" event={"ID":"c1c156bc-1694-457b-b26e-c46d6b5be62d","Type":"ContainerDied","Data":"0d5f8b657611de154f50378be5e8f2d4d336eb8019911cacab7c438655120611"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.836985 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr6lr" event={"ID":"c1c156bc-1694-457b-b26e-c46d6b5be62d","Type":"ContainerStarted","Data":"b70794f7d98e400c9d9c82da1afb31c35d76a6e1013b3b95bdd0a6ff3471bd39"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.838270 4618 generic.go:334] "Generic (PLEG): container finished" podID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerID="f875f165f57a949746f8fc769af7177f6da0358d1122111684e937c07f050bb2" exitCode=0 Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.838328 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwzlc" event={"ID":"18ec235a-04ac-489e-92cd-e1e69c8a1074","Type":"ContainerDied","Data":"f875f165f57a949746f8fc769af7177f6da0358d1122111684e937c07f050bb2"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.839718 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.839725 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd" event={"ID":"e06263ed-35a1-4532-ba9a-8521ec8a5b1d","Type":"ContainerDied","Data":"740936a5041d529c880383e51c50ecf17c07c3dc6cb77b992a01e28c83d32d8c"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.839846 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="740936a5041d529c880383e51c50ecf17c07c3dc6cb77b992a01e28c83d32d8c" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.841042 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" event={"ID":"ad054293-342a-4919-b938-6032654fbc53","Type":"ContainerStarted","Data":"1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.841095 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.841105 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" event={"ID":"ad054293-342a-4919-b938-6032654fbc53","Type":"ContainerStarted","Data":"ee354de2302baa33d6a95142b735179115555b5aaf7a0faeca8bc072c4601a6d"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.843523 4618 generic.go:334] "Generic (PLEG): container finished" podID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerID="3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9" exitCode=0 Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.843616 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsw65" 
event={"ID":"4bc1d458-1c8f-4afb-b209-be769710ccf2","Type":"ContainerDied","Data":"3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.843660 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsw65" event={"ID":"4bc1d458-1c8f-4afb-b209-be769710ccf2","Type":"ContainerStarted","Data":"33aa2592641ea139182a8214db880ad7647339b80570383cde43d43908eae4a9"} Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.865754 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" podStartSLOduration=121.865740237 podStartE2EDuration="2m1.865740237s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:05:43.86337717 +0000 UTC m=+142.613844487" watchObservedRunningTime="2026-01-21 09:05:43.865740237 +0000 UTC m=+142.616207554" Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.946216 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:43 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:43 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:43 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:43 crc kubenswrapper[4618]: I0121 09:05:43.946284 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.497549 4618 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-t62l5"] Jan 21 09:05:44 crc kubenswrapper[4618]: E0121 09:05:44.497763 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06263ed-35a1-4532-ba9a-8521ec8a5b1d" containerName="collect-profiles" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.497782 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06263ed-35a1-4532-ba9a-8521ec8a5b1d" containerName="collect-profiles" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.497882 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06263ed-35a1-4532-ba9a-8521ec8a5b1d" containerName="collect-profiles" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.498520 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.500083 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.512493 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t62l5"] Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.658407 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-utilities\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.658499 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhdp9\" (UniqueName: \"kubernetes.io/projected/69e6a09f-0983-4b1b-83a7-13e8acd56f61-kube-api-access-lhdp9\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") 
" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.658537 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-catalog-content\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.762218 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-utilities\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.762455 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdp9\" (UniqueName: \"kubernetes.io/projected/69e6a09f-0983-4b1b-83a7-13e8acd56f61-kube-api-access-lhdp9\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.762554 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-catalog-content\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.762755 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-utilities\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " 
pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.763035 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-catalog-content\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.782427 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhdp9\" (UniqueName: \"kubernetes.io/projected/69e6a09f-0983-4b1b-83a7-13e8acd56f61-kube-api-access-lhdp9\") pod \"redhat-marketplace-t62l5\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.826204 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.904929 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sf7bm"] Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.906461 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.910395 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf7bm"] Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.944521 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:44 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:44 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:44 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:44 crc kubenswrapper[4618]: I0121 09:05:44.944574 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.065960 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7hb9\" (UniqueName: \"kubernetes.io/projected/792d0f04-c700-4062-9eb3-a98b0d5e41d9-kube-api-access-x7hb9\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.066125 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-utilities\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.066193 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-catalog-content\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.168325 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-catalog-content\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.168479 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7hb9\" (UniqueName: \"kubernetes.io/projected/792d0f04-c700-4062-9eb3-a98b0d5e41d9-kube-api-access-x7hb9\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.168584 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-utilities\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.169244 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-catalog-content\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.169307 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-utilities\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.184509 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7hb9\" (UniqueName: \"kubernetes.io/projected/792d0f04-c700-4062-9eb3-a98b0d5e41d9-kube-api-access-x7hb9\") pod \"redhat-marketplace-sf7bm\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.254762 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.286151 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t62l5"] Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.459418 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf7bm"] Jan 21 09:05:45 crc kubenswrapper[4618]: W0121 09:05:45.464676 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792d0f04_c700_4062_9eb3_a98b0d5e41d9.slice/crio-005998c143a4fd18ada78565cb3ea195b25e6de679c3b6e82468d80c28723006 WatchSource:0}: Error finding container 005998c143a4fd18ada78565cb3ea195b25e6de679c3b6e82468d80c28723006: Status 404 returned error can't find the container with id 005998c143a4fd18ada78565cb3ea195b25e6de679c3b6e82468d80c28723006 Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.503505 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8wcpt"] Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 
09:05:45.504671 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.507475 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.509720 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8wcpt"] Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.574650 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-utilities\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.574719 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-catalog-content\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.574828 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rl5\" (UniqueName: \"kubernetes.io/projected/c4468290-050b-4a6a-9388-cbbae3c71d68-kube-api-access-l7rl5\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.676234 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-utilities\") pod 
\"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.676283 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-catalog-content\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.676362 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rl5\" (UniqueName: \"kubernetes.io/projected/c4468290-050b-4a6a-9388-cbbae3c71d68-kube-api-access-l7rl5\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.676608 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-utilities\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.676769 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-catalog-content\") pod \"redhat-operators-8wcpt\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.693629 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rl5\" (UniqueName: \"kubernetes.io/projected/c4468290-050b-4a6a-9388-cbbae3c71d68-kube-api-access-l7rl5\") pod \"redhat-operators-8wcpt\" (UID: 
\"c4468290-050b-4a6a-9388-cbbae3c71d68\") " pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.867080 4618 generic.go:334] "Generic (PLEG): container finished" podID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerID="7c983e09b1696df8f740c8fad1dfe8f0c36e2333eef937ace1997bfc128d2470" exitCode=0 Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.867183 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t62l5" event={"ID":"69e6a09f-0983-4b1b-83a7-13e8acd56f61","Type":"ContainerDied","Data":"7c983e09b1696df8f740c8fad1dfe8f0c36e2333eef937ace1997bfc128d2470"} Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.867263 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t62l5" event={"ID":"69e6a09f-0983-4b1b-83a7-13e8acd56f61","Type":"ContainerStarted","Data":"5ce6d88a26a24a04a53b410db725b0989e541c2f7f713c6ec13b684650cd0b25"} Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.869915 4618 generic.go:334] "Generic (PLEG): container finished" podID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerID="8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d" exitCode=0 Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.869933 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf7bm" event={"ID":"792d0f04-c700-4062-9eb3-a98b0d5e41d9","Type":"ContainerDied","Data":"8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d"} Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.869947 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf7bm" event={"ID":"792d0f04-c700-4062-9eb3-a98b0d5e41d9","Type":"ContainerStarted","Data":"005998c143a4fd18ada78565cb3ea195b25e6de679c3b6e82468d80c28723006"} Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.903628 4618 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-b8qzs"] Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.904528 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.907280 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b8qzs"] Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.911917 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.947004 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:45 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:45 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:45 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.947101 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.980683 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqx4h\" (UniqueName: \"kubernetes.io/projected/30a7821a-79fa-4a74-8cc9-220ddc395bba-kube-api-access-gqx4h\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.980856 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-catalog-content\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:45 crc kubenswrapper[4618]: I0121 09:05:45.980961 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-utilities\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.082081 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqx4h\" (UniqueName: \"kubernetes.io/projected/30a7821a-79fa-4a74-8cc9-220ddc395bba-kube-api-access-gqx4h\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.082163 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-catalog-content\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.082222 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-utilities\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.082689 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-utilities\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.082684 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-catalog-content\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.100010 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqx4h\" (UniqueName: \"kubernetes.io/projected/30a7821a-79fa-4a74-8cc9-220ddc395bba-kube-api-access-gqx4h\") pod \"redhat-operators-b8qzs\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.116280 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8wcpt"] Jan 21 09:05:46 crc kubenswrapper[4618]: W0121 09:05:46.128218 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4468290_050b_4a6a_9388_cbbae3c71d68.slice/crio-8b09f5eccd18505be023c5fc33be11f43cbc85fc4a5acd5e3d1c383538a75143 WatchSource:0}: Error finding container 8b09f5eccd18505be023c5fc33be11f43cbc85fc4a5acd5e3d1c383538a75143: Status 404 returned error can't find the container with id 8b09f5eccd18505be023c5fc33be11f43cbc85fc4a5acd5e3d1c383538a75143 Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.163760 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 
09:05:46.165073 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.167868 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.168717 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.168981 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.225917 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.285152 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f440e26-4942-49af-a22b-ace9cce52301-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.285267 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f440e26-4942-49af-a22b-ace9cce52301-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.387311 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3f440e26-4942-49af-a22b-ace9cce52301-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.387675 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f440e26-4942-49af-a22b-ace9cce52301-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.387766 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f440e26-4942-49af-a22b-ace9cce52301-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.405931 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f440e26-4942-49af-a22b-ace9cce52301-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.409825 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b8qzs"] Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.481482 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.678473 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 09:05:46 crc kubenswrapper[4618]: W0121 09:05:46.680341 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3f440e26_4942_49af_a22b_ace9cce52301.slice/crio-73efd8562091ad3b3b8f951db439a7358cf1f6e95790adfe8f3d8449e5318477 WatchSource:0}: Error finding container 73efd8562091ad3b3b8f951db439a7358cf1f6e95790adfe8f3d8449e5318477: Status 404 returned error can't find the container with id 73efd8562091ad3b3b8f951db439a7358cf1f6e95790adfe8f3d8449e5318477 Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.880107 4618 generic.go:334] "Generic (PLEG): container finished" podID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerID="ecdbf89c996a4008bdc0711f7fbc98d3aa44dfc9c2515509c93569c5ee0f459c" exitCode=0 Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.880192 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8qzs" event={"ID":"30a7821a-79fa-4a74-8cc9-220ddc395bba","Type":"ContainerDied","Data":"ecdbf89c996a4008bdc0711f7fbc98d3aa44dfc9c2515509c93569c5ee0f459c"} Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.880220 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8qzs" event={"ID":"30a7821a-79fa-4a74-8cc9-220ddc395bba","Type":"ContainerStarted","Data":"2daac383ceae2fbabbc63e493cf3ee285b4ddee68a810b5ed39fdb90959cb470"} Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.883415 4618 generic.go:334] "Generic (PLEG): container finished" podID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerID="b7a7c0a85806504fe8ab35ba83b3635f0832feb7165b524a435ab4bd85c54a1f" exitCode=0 Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.883453 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8wcpt" event={"ID":"c4468290-050b-4a6a-9388-cbbae3c71d68","Type":"ContainerDied","Data":"b7a7c0a85806504fe8ab35ba83b3635f0832feb7165b524a435ab4bd85c54a1f"} Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.883468 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8wcpt" event={"ID":"c4468290-050b-4a6a-9388-cbbae3c71d68","Type":"ContainerStarted","Data":"8b09f5eccd18505be023c5fc33be11f43cbc85fc4a5acd5e3d1c383538a75143"} Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.891449 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3f440e26-4942-49af-a22b-ace9cce52301","Type":"ContainerStarted","Data":"73efd8562091ad3b3b8f951db439a7358cf1f6e95790adfe8f3d8449e5318477"} Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.945439 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:46 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:46 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:46 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.945583 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.965529 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.965753 4618 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.967839 4618 patch_prober.go:28] interesting pod/console-f9d7485db-dd2fv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 21 09:05:46 crc kubenswrapper[4618]: I0121 09:05:46.967877 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dd2fv" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.248017 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4jk5f" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.402614 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.402668 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.406278 4618 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.407723 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.504079 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.504152 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.513459 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.519627 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.646388 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.652450 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.659136 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.741073 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.751455 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cj7pg" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.910671 4618 generic.go:334] "Generic (PLEG): container finished" podID="3f440e26-4942-49af-a22b-ace9cce52301" containerID="68ca184c5a3bd04fa94fe8a139daf5a85bc208f84bdd64be025a252c34e7a21c" exitCode=0 Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.910794 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3f440e26-4942-49af-a22b-ace9cce52301","Type":"ContainerDied","Data":"68ca184c5a3bd04fa94fe8a139daf5a85bc208f84bdd64be025a252c34e7a21c"} Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.943284 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.946066 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:47 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:47 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:47 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:47 crc kubenswrapper[4618]: I0121 09:05:47.946168 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:48 crc kubenswrapper[4618]: I0121 09:05:48.944715 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:48 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:48 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:48 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:48 crc kubenswrapper[4618]: I0121 09:05:48.944767 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.095473 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.096246 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.098049 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.103886 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.104460 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.230075 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4921636a-eb03-4b1f-b95b-17856328414a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.230130 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4921636a-eb03-4b1f-b95b-17856328414a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.331484 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4921636a-eb03-4b1f-b95b-17856328414a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.331619 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4921636a-eb03-4b1f-b95b-17856328414a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.331732 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4921636a-eb03-4b1f-b95b-17856328414a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.348535 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4921636a-eb03-4b1f-b95b-17856328414a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.425519 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.944826 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:49 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:49 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:49 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:49 crc kubenswrapper[4618]: I0121 09:05:49.945120 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.074413 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dcdxm" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.629607 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.751748 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f440e26-4942-49af-a22b-ace9cce52301-kube-api-access\") pod \"3f440e26-4942-49af-a22b-ace9cce52301\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.752275 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f440e26-4942-49af-a22b-ace9cce52301-kubelet-dir\") pod \"3f440e26-4942-49af-a22b-ace9cce52301\" (UID: \"3f440e26-4942-49af-a22b-ace9cce52301\") " Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.752472 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f440e26-4942-49af-a22b-ace9cce52301-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3f440e26-4942-49af-a22b-ace9cce52301" (UID: "3f440e26-4942-49af-a22b-ace9cce52301"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.756727 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f440e26-4942-49af-a22b-ace9cce52301-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3f440e26-4942-49af-a22b-ace9cce52301" (UID: "3f440e26-4942-49af-a22b-ace9cce52301"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.853737 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f440e26-4942-49af-a22b-ace9cce52301-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.853758 4618 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f440e26-4942-49af-a22b-ace9cce52301-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:50 crc kubenswrapper[4618]: W0121 09:05:50.913749 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4f651dabe90dfea787de7889fa90ca7452112480a4853bb0a1bc64ebc651b805 WatchSource:0}: Error finding container 4f651dabe90dfea787de7889fa90ca7452112480a4853bb0a1bc64ebc651b805: Status 404 returned error can't find the container with id 4f651dabe90dfea787de7889fa90ca7452112480a4853bb0a1bc64ebc651b805 Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.945469 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4f651dabe90dfea787de7889fa90ca7452112480a4853bb0a1bc64ebc651b805"} Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.953209 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:50 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:50 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:50 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:50 crc 
kubenswrapper[4618]: I0121 09:05:50.953260 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.954952 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.955052 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3f440e26-4942-49af-a22b-ace9cce52301","Type":"ContainerDied","Data":"73efd8562091ad3b3b8f951db439a7358cf1f6e95790adfe8f3d8449e5318477"} Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.955116 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73efd8562091ad3b3b8f951db439a7358cf1f6e95790adfe8f3d8449e5318477" Jan 21 09:05:50 crc kubenswrapper[4618]: I0121 09:05:50.987357 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 09:05:51 crc kubenswrapper[4618]: W0121 09:05:51.003834 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4921636a_eb03_4b1f_b95b_17856328414a.slice/crio-763cd42cdcc056a00fedcb27196a980fcf52874483688e4219a4df8abd5a905a WatchSource:0}: Error finding container 763cd42cdcc056a00fedcb27196a980fcf52874483688e4219a4df8abd5a905a: Status 404 returned error can't find the container with id 763cd42cdcc056a00fedcb27196a980fcf52874483688e4219a4df8abd5a905a Jan 21 09:05:51 crc kubenswrapper[4618]: W0121 09:05:51.004680 4618 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-76ca81605c42f169e8fd20ed6e6e5ec27b0a19b5d77cfa1c422d130957776a16 WatchSource:0}: Error finding container 76ca81605c42f169e8fd20ed6e6e5ec27b0a19b5d77cfa1c422d130957776a16: Status 404 returned error can't find the container with id 76ca81605c42f169e8fd20ed6e6e5ec27b0a19b5d77cfa1c422d130957776a16 Jan 21 09:05:51 crc kubenswrapper[4618]: W0121 09:05:51.005442 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-bbd248c7251ff2de1da354641697725be96ef375da8b05305467f1425c33960e WatchSource:0}: Error finding container bbd248c7251ff2de1da354641697725be96ef375da8b05305467f1425c33960e: Status 404 returned error can't find the container with id bbd248c7251ff2de1da354641697725be96ef375da8b05305467f1425c33960e Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.949430 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:51 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:51 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:51 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.949799 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.970061 4618 generic.go:334] "Generic (PLEG): container finished" podID="4921636a-eb03-4b1f-b95b-17856328414a" 
containerID="8f3815259bdf9887734f5ba8fad6d94ee8247bcd8ec7f51c0a4eb77cd5a252bb" exitCode=0 Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.970120 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4921636a-eb03-4b1f-b95b-17856328414a","Type":"ContainerDied","Data":"8f3815259bdf9887734f5ba8fad6d94ee8247bcd8ec7f51c0a4eb77cd5a252bb"} Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.970172 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4921636a-eb03-4b1f-b95b-17856328414a","Type":"ContainerStarted","Data":"763cd42cdcc056a00fedcb27196a980fcf52874483688e4219a4df8abd5a905a"} Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.977512 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6d8b30ad0ab955e289c5b55b3db7113cdac024422f6af57d110b0900cf78e843"} Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.977609 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.990115 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"72e93d8248fc947e28702e669239792cf4e5e861d9e88a1f79ba5e09707629e0"} Jan 21 09:05:51 crc kubenswrapper[4618]: I0121 09:05:51.990163 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"76ca81605c42f169e8fd20ed6e6e5ec27b0a19b5d77cfa1c422d130957776a16"} Jan 21 09:05:52 crc kubenswrapper[4618]: I0121 
09:05:52.007653 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d144665e35305e3e01b97cbb57dcea3aa9e756ce7552a2930eb2e82cac6731ef"} Jan 21 09:05:52 crc kubenswrapper[4618]: I0121 09:05:52.007722 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bbd248c7251ff2de1da354641697725be96ef375da8b05305467f1425c33960e"} Jan 21 09:05:52 crc kubenswrapper[4618]: I0121 09:05:52.944651 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:52 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:52 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:52 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:52 crc kubenswrapper[4618]: I0121 09:05:52.944947 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:53 crc kubenswrapper[4618]: I0121 09:05:53.945575 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:53 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:53 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:53 crc kubenswrapper[4618]: healthz check failed Jan 21 
09:05:53 crc kubenswrapper[4618]: I0121 09:05:53.945666 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:54 crc kubenswrapper[4618]: I0121 09:05:54.944174 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:54 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:54 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:54 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:54 crc kubenswrapper[4618]: I0121 09:05:54.944232 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.422549 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.557356 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.660860 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4921636a-eb03-4b1f-b95b-17856328414a-kube-api-access\") pod \"4921636a-eb03-4b1f-b95b-17856328414a\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.660909 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4921636a-eb03-4b1f-b95b-17856328414a-kubelet-dir\") pod \"4921636a-eb03-4b1f-b95b-17856328414a\" (UID: \"4921636a-eb03-4b1f-b95b-17856328414a\") " Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.661233 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4921636a-eb03-4b1f-b95b-17856328414a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4921636a-eb03-4b1f-b95b-17856328414a" (UID: "4921636a-eb03-4b1f-b95b-17856328414a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.665488 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4921636a-eb03-4b1f-b95b-17856328414a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4921636a-eb03-4b1f-b95b-17856328414a" (UID: "4921636a-eb03-4b1f-b95b-17856328414a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.762808 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4921636a-eb03-4b1f-b95b-17856328414a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.762833 4618 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4921636a-eb03-4b1f-b95b-17856328414a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.944591 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:55 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:55 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:55 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:55 crc kubenswrapper[4618]: I0121 09:05:55.944642 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.040909 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.040825 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4921636a-eb03-4b1f-b95b-17856328414a","Type":"ContainerDied","Data":"763cd42cdcc056a00fedcb27196a980fcf52874483688e4219a4df8abd5a905a"} Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.042245 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="763cd42cdcc056a00fedcb27196a980fcf52874483688e4219a4df8abd5a905a" Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.945265 4618 patch_prober.go:28] interesting pod/router-default-5444994796-sknqd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 09:05:56 crc kubenswrapper[4618]: [-]has-synced failed: reason withheld Jan 21 09:05:56 crc kubenswrapper[4618]: [+]process-running ok Jan 21 09:05:56 crc kubenswrapper[4618]: healthz check failed Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.945328 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sknqd" podUID="30a8096c-9962-4989-9811-54a9522f4e2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.959555 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.959649 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" 
podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.965106 4618 patch_prober.go:28] interesting pod/console-f9d7485db-dd2fv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 21 09:05:56 crc kubenswrapper[4618]: I0121 09:05:56.965151 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dd2fv" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 21 09:05:57 crc kubenswrapper[4618]: I0121 09:05:57.945250 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:57 crc kubenswrapper[4618]: I0121 09:05:57.947587 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-sknqd" Jan 21 09:05:58 crc kubenswrapper[4618]: I0121 09:05:58.752397 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ck8kg"] Jan 21 09:05:58 crc kubenswrapper[4618]: I0121 09:05:58.752743 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" podUID="47473970-6704-4bb3-83fb-eee0a9db5552" containerName="controller-manager" containerID="cri-o://d588eb3c8b2eccc8ab3ce49175b53d2700785a61a11426ca838a97f63a47d383" gracePeriod=30 Jan 21 09:05:58 crc kubenswrapper[4618]: I0121 09:05:58.765097 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs"] Jan 21 09:05:58 crc kubenswrapper[4618]: I0121 09:05:58.765238 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" podUID="c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" containerName="route-controller-manager" containerID="cri-o://d279fa688589063f347b1f44daa69072d2e493dec3ebfdc374dbc46d9d13138b" gracePeriod=30 Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.058180 4618 generic.go:334] "Generic (PLEG): container finished" podID="c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" containerID="d279fa688589063f347b1f44daa69072d2e493dec3ebfdc374dbc46d9d13138b" exitCode=0 Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.058241 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" event={"ID":"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6","Type":"ContainerDied","Data":"d279fa688589063f347b1f44daa69072d2e493dec3ebfdc374dbc46d9d13138b"} Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.059760 4618 generic.go:334] "Generic (PLEG): container finished" podID="47473970-6704-4bb3-83fb-eee0a9db5552" containerID="d588eb3c8b2eccc8ab3ce49175b53d2700785a61a11426ca838a97f63a47d383" exitCode=0 Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.059808 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" event={"ID":"47473970-6704-4bb3-83fb-eee0a9db5552","Type":"ContainerDied","Data":"d588eb3c8b2eccc8ab3ce49175b53d2700785a61a11426ca838a97f63a47d383"} Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.470608 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.508648 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-config\") pod \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.508748 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22gzd\" (UniqueName: \"kubernetes.io/projected/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-kube-api-access-22gzd\") pod \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.508780 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-serving-cert\") pod \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.508819 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-client-ca\") pod \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\" (UID: \"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.509484 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" (UID: "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.510357 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-config" (OuterVolumeSpecName: "config") pod "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" (UID: "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.512045 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.513808 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-kube-api-access-22gzd" (OuterVolumeSpecName: "kube-api-access-22gzd") pod "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" (UID: "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6"). InnerVolumeSpecName "kube-api-access-22gzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.518511 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" (UID: "c1d744cf-898c-4d77-986b-2e3ec0a7ccb6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.610844 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22gzd\" (UniqueName: \"kubernetes.io/projected/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-kube-api-access-22gzd\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.610874 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.610900 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.610909 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.712176 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-proxy-ca-bundles\") pod \"47473970-6704-4bb3-83fb-eee0a9db5552\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.712247 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-config\") pod \"47473970-6704-4bb3-83fb-eee0a9db5552\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.712293 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7gs\" (UniqueName: 
\"kubernetes.io/projected/47473970-6704-4bb3-83fb-eee0a9db5552-kube-api-access-9f7gs\") pod \"47473970-6704-4bb3-83fb-eee0a9db5552\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.712313 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-client-ca\") pod \"47473970-6704-4bb3-83fb-eee0a9db5552\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.712392 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47473970-6704-4bb3-83fb-eee0a9db5552-serving-cert\") pod \"47473970-6704-4bb3-83fb-eee0a9db5552\" (UID: \"47473970-6704-4bb3-83fb-eee0a9db5552\") " Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.713562 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "47473970-6704-4bb3-83fb-eee0a9db5552" (UID: "47473970-6704-4bb3-83fb-eee0a9db5552"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.714468 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-config" (OuterVolumeSpecName: "config") pod "47473970-6704-4bb3-83fb-eee0a9db5552" (UID: "47473970-6704-4bb3-83fb-eee0a9db5552"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.714517 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-client-ca" (OuterVolumeSpecName: "client-ca") pod "47473970-6704-4bb3-83fb-eee0a9db5552" (UID: "47473970-6704-4bb3-83fb-eee0a9db5552"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.715506 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47473970-6704-4bb3-83fb-eee0a9db5552-kube-api-access-9f7gs" (OuterVolumeSpecName: "kube-api-access-9f7gs") pod "47473970-6704-4bb3-83fb-eee0a9db5552" (UID: "47473970-6704-4bb3-83fb-eee0a9db5552"). InnerVolumeSpecName "kube-api-access-9f7gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.716130 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47473970-6704-4bb3-83fb-eee0a9db5552-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47473970-6704-4bb3-83fb-eee0a9db5552" (UID: "47473970-6704-4bb3-83fb-eee0a9db5552"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.813480 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.813508 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.813519 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f7gs\" (UniqueName: \"kubernetes.io/projected/47473970-6704-4bb3-83fb-eee0a9db5552-kube-api-access-9f7gs\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.813529 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47473970-6704-4bb3-83fb-eee0a9db5552-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.813538 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47473970-6704-4bb3-83fb-eee0a9db5552-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976298 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5788dfcd64-ts25l"] Jan 21 09:05:59 crc kubenswrapper[4618]: E0121 09:05:59.976569 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f440e26-4942-49af-a22b-ace9cce52301" containerName="pruner" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976581 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f440e26-4942-49af-a22b-ace9cce52301" containerName="pruner" Jan 21 09:05:59 crc kubenswrapper[4618]: E0121 
09:05:59.976591 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" containerName="route-controller-manager" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976597 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" containerName="route-controller-manager" Jan 21 09:05:59 crc kubenswrapper[4618]: E0121 09:05:59.976605 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4921636a-eb03-4b1f-b95b-17856328414a" containerName="pruner" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976610 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="4921636a-eb03-4b1f-b95b-17856328414a" containerName="pruner" Jan 21 09:05:59 crc kubenswrapper[4618]: E0121 09:05:59.976625 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47473970-6704-4bb3-83fb-eee0a9db5552" containerName="controller-manager" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976631 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="47473970-6704-4bb3-83fb-eee0a9db5552" containerName="controller-manager" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976705 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f440e26-4942-49af-a22b-ace9cce52301" containerName="pruner" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976714 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" containerName="route-controller-manager" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976720 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="4921636a-eb03-4b1f-b95b-17856328414a" containerName="pruner" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.976729 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="47473970-6704-4bb3-83fb-eee0a9db5552" containerName="controller-manager" Jan 21 09:05:59 crc kubenswrapper[4618]: 
I0121 09:05:59.977071 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.981636 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7"] Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.982461 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.982544 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5788dfcd64-ts25l"] Jan 21 09:05:59 crc kubenswrapper[4618]: I0121 09:05:59.986027 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7"] Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.065760 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" event={"ID":"c1d744cf-898c-4d77-986b-2e3ec0a7ccb6","Type":"ContainerDied","Data":"2ff492034a72fc4c1e44830bdf14476aeec334f1721ed155ad9f0ee761b4af7a"} Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.065811 4618 scope.go:117] "RemoveContainer" containerID="d279fa688589063f347b1f44daa69072d2e493dec3ebfdc374dbc46d9d13138b" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.065933 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.069535 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.069534 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ck8kg" event={"ID":"47473970-6704-4bb3-83fb-eee0a9db5552","Type":"ContainerDied","Data":"a885e00f0550c6c4edf6d1578bb60b345da9b2b776a478c90eb8748381eb3087"} Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.071451 4618 generic.go:334] "Generic (PLEG): container finished" podID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerID="4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d" exitCode=0 Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.071537 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsw65" event={"ID":"4bc1d458-1c8f-4afb-b209-be769710ccf2","Type":"ContainerDied","Data":"4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d"} Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.073218 4618 generic.go:334] "Generic (PLEG): container finished" podID="71db5598-40a0-4583-88ee-7add145de7ac" containerID="8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f" exitCode=0 Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.073419 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvr4r" event={"ID":"71db5598-40a0-4583-88ee-7add145de7ac","Type":"ContainerDied","Data":"8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f"} Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.077359 4618 generic.go:334] "Generic (PLEG): container finished" podID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerID="1921801e602f242bf4d79e42f06e2355a29dcd078924473c6c9bf01064f8702a" exitCode=0 Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.077413 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-qr6lr" event={"ID":"c1c156bc-1694-457b-b26e-c46d6b5be62d","Type":"ContainerDied","Data":"1921801e602f242bf4d79e42f06e2355a29dcd078924473c6c9bf01064f8702a"} Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.078580 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs"] Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.079410 4618 generic.go:334] "Generic (PLEG): container finished" podID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerID="5141c89656827dc6862cc75abb093172b412ffcb6e17a94eb048bb487f83b1fc" exitCode=0 Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.079447 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwzlc" event={"ID":"18ec235a-04ac-489e-92cd-e1e69c8a1074","Type":"ContainerDied","Data":"5141c89656827dc6862cc75abb093172b412ffcb6e17a94eb048bb487f83b1fc"} Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.082347 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gqzfs"] Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116627 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfj75\" (UniqueName: \"kubernetes.io/projected/d6cbe290-68b9-4de5-b082-f5426c112cae-kube-api-access-jfj75\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116680 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-config\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: 
\"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116705 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-client-ca\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116728 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-client-ca\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116799 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-proxy-ca-bundles\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116816 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnmc\" (UniqueName: \"kubernetes.io/projected/3175ff26-a9fa-4793-89f9-776156f8fdc6-kube-api-access-bbnmc\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116844 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-config\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.116997 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3175ff26-a9fa-4793-89f9-776156f8fdc6-serving-cert\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.117096 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cbe290-68b9-4de5-b082-f5426c112cae-serving-cert\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.148842 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ck8kg"] Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.151258 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ck8kg"] Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223028 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-proxy-ca-bundles\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " 
pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223065 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnmc\" (UniqueName: \"kubernetes.io/projected/3175ff26-a9fa-4793-89f9-776156f8fdc6-kube-api-access-bbnmc\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223095 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-config\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223118 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3175ff26-a9fa-4793-89f9-776156f8fdc6-serving-cert\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223184 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cbe290-68b9-4de5-b082-f5426c112cae-serving-cert\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223223 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfj75\" (UniqueName: 
\"kubernetes.io/projected/d6cbe290-68b9-4de5-b082-f5426c112cae-kube-api-access-jfj75\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223256 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-config\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223282 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-client-ca\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.223300 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-client-ca\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.224475 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-proxy-ca-bundles\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 
09:06:00.225559 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-config\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.226950 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-config\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.227047 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-client-ca\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.227283 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-client-ca\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.229624 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3175ff26-a9fa-4793-89f9-776156f8fdc6-serving-cert\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 
21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.229672 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cbe290-68b9-4de5-b082-f5426c112cae-serving-cert\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.237536 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnmc\" (UniqueName: \"kubernetes.io/projected/3175ff26-a9fa-4793-89f9-776156f8fdc6-kube-api-access-bbnmc\") pod \"controller-manager-5788dfcd64-ts25l\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.239121 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfj75\" (UniqueName: \"kubernetes.io/projected/d6cbe290-68b9-4de5-b082-f5426c112cae-kube-api-access-jfj75\") pod \"route-controller-manager-d47948cf9-k4pg7\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.305189 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:00 crc kubenswrapper[4618]: I0121 09:06:00.311221 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:01 crc kubenswrapper[4618]: I0121 09:06:01.552019 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47473970-6704-4bb3-83fb-eee0a9db5552" path="/var/lib/kubelet/pods/47473970-6704-4bb3-83fb-eee0a9db5552/volumes" Jan 21 09:06:01 crc kubenswrapper[4618]: I0121 09:06:01.552519 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d744cf-898c-4d77-986b-2e3ec0a7ccb6" path="/var/lib/kubelet/pods/c1d744cf-898c-4d77-986b-2e3ec0a7ccb6/volumes" Jan 21 09:06:02 crc kubenswrapper[4618]: I0121 09:06:02.984062 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:06:03 crc kubenswrapper[4618]: I0121 09:06:03.809328 4618 scope.go:117] "RemoveContainer" containerID="d588eb3c8b2eccc8ab3ce49175b53d2700785a61a11426ca838a97f63a47d383" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.070841 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5788dfcd64-ts25l"] Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.139260 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsw65" event={"ID":"4bc1d458-1c8f-4afb-b209-be769710ccf2","Type":"ContainerStarted","Data":"81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b"} Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.168089 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvr4r" event={"ID":"71db5598-40a0-4583-88ee-7add145de7ac","Type":"ContainerStarted","Data":"c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6"} Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.168905 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-gsw65" podStartSLOduration=2.09321013 podStartE2EDuration="22.168885877s" podCreationTimestamp="2026-01-21 09:05:42 +0000 UTC" firstStartedPulling="2026-01-21 09:05:43.84542026 +0000 UTC m=+142.595887577" lastFinishedPulling="2026-01-21 09:06:03.921096006 +0000 UTC m=+162.671563324" observedRunningTime="2026-01-21 09:06:04.168280871 +0000 UTC m=+162.918748188" watchObservedRunningTime="2026-01-21 09:06:04.168885877 +0000 UTC m=+162.919353193" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.175095 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr6lr" event={"ID":"c1c156bc-1694-457b-b26e-c46d6b5be62d","Type":"ContainerStarted","Data":"7100f5900f56f994582a364d2fa1ec70980642ea217cdfcbf00f26ff4c6fa65f"} Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.188507 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvr4r" podStartSLOduration=2.138374888 podStartE2EDuration="22.188489799s" podCreationTimestamp="2026-01-21 09:05:42 +0000 UTC" firstStartedPulling="2026-01-21 09:05:43.835548979 +0000 UTC m=+142.586016297" lastFinishedPulling="2026-01-21 09:06:03.885663891 +0000 UTC m=+162.636131208" observedRunningTime="2026-01-21 09:06:04.185339164 +0000 UTC m=+162.935806481" watchObservedRunningTime="2026-01-21 09:06:04.188489799 +0000 UTC m=+162.938957116" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.203862 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qr6lr" podStartSLOduration=2.136580119 podStartE2EDuration="22.203841869s" podCreationTimestamp="2026-01-21 09:05:42 +0000 UTC" firstStartedPulling="2026-01-21 09:05:43.837795276 +0000 UTC m=+142.588262584" lastFinishedPulling="2026-01-21 09:06:03.905057017 +0000 UTC m=+162.655524334" observedRunningTime="2026-01-21 09:06:04.203133359 +0000 UTC m=+162.953600676" 
watchObservedRunningTime="2026-01-21 09:06:04.203841869 +0000 UTC m=+162.954309186" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.221821 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7"] Jan 21 09:06:04 crc kubenswrapper[4618]: W0121 09:06:04.230463 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6cbe290_68b9_4de5_b082_f5426c112cae.slice/crio-ff7d951b6a9bb968277183fe81c6c982f77764c3c3245823045851d3e7eb3728 WatchSource:0}: Error finding container ff7d951b6a9bb968277183fe81c6c982f77764c3c3245823045851d3e7eb3728: Status 404 returned error can't find the container with id ff7d951b6a9bb968277183fe81c6c982f77764c3c3245823045851d3e7eb3728 Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.376898 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.385627 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d164c95c-cb58-47e7-a3a3-7e7bce8b9743-metrics-certs\") pod \"network-metrics-daemon-kpxzc\" (UID: \"d164c95c-cb58-47e7-a3a3-7e7bce8b9743\") " pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.646892 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpxzc" Jan 21 09:06:04 crc kubenswrapper[4618]: I0121 09:06:04.859310 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpxzc"] Jan 21 09:06:04 crc kubenswrapper[4618]: W0121 09:06:04.866225 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd164c95c_cb58_47e7_a3a3_7e7bce8b9743.slice/crio-7489c0f6c8fc1cb16617e2eed76bedadf7b3b3bf618d30d9ab1c1930f2ee3668 WatchSource:0}: Error finding container 7489c0f6c8fc1cb16617e2eed76bedadf7b3b3bf618d30d9ab1c1930f2ee3668: Status 404 returned error can't find the container with id 7489c0f6c8fc1cb16617e2eed76bedadf7b3b3bf618d30d9ab1c1930f2ee3668 Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.183652 4618 generic.go:334] "Generic (PLEG): container finished" podID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerID="2731be176de8b206a5f4c69d59b92d3eda352165b8913271a4da42cbd20b3e42" exitCode=0 Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.183750 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t62l5" event={"ID":"69e6a09f-0983-4b1b-83a7-13e8acd56f61","Type":"ContainerDied","Data":"2731be176de8b206a5f4c69d59b92d3eda352165b8913271a4da42cbd20b3e42"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.187222 4618 generic.go:334] "Generic (PLEG): container finished" podID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerID="2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9" exitCode=0 Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.187281 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf7bm" event={"ID":"792d0f04-c700-4062-9eb3-a98b0d5e41d9","Type":"ContainerDied","Data":"2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 
09:06:05.189771 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" event={"ID":"d164c95c-cb58-47e7-a3a3-7e7bce8b9743","Type":"ContainerStarted","Data":"13d281be6ffd99fbe5179fd82337304c683236073b4777390b141ebeae01d136"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.189814 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" event={"ID":"d164c95c-cb58-47e7-a3a3-7e7bce8b9743","Type":"ContainerStarted","Data":"7489c0f6c8fc1cb16617e2eed76bedadf7b3b3bf618d30d9ab1c1930f2ee3668"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.197479 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" event={"ID":"d6cbe290-68b9-4de5-b082-f5426c112cae","Type":"ContainerStarted","Data":"2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.197512 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" event={"ID":"d6cbe290-68b9-4de5-b082-f5426c112cae","Type":"ContainerStarted","Data":"ff7d951b6a9bb968277183fe81c6c982f77764c3c3245823045851d3e7eb3728"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.197672 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.203015 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwzlc" event={"ID":"18ec235a-04ac-489e-92cd-e1e69c8a1074","Type":"ContainerStarted","Data":"453a79e35c186154f3fb060a24b1d797a2c494f654a8abb255ddc31ea17934b7"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.204376 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.206701 4618 generic.go:334] "Generic (PLEG): container finished" podID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerID="818ccaabda4a04d7fada0b0700ceee7352238aa8ce48e36ac6c9b1fddfa4d247" exitCode=0 Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.206767 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8qzs" event={"ID":"30a7821a-79fa-4a74-8cc9-220ddc395bba","Type":"ContainerDied","Data":"818ccaabda4a04d7fada0b0700ceee7352238aa8ce48e36ac6c9b1fddfa4d247"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.212565 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" event={"ID":"3175ff26-a9fa-4793-89f9-776156f8fdc6","Type":"ContainerStarted","Data":"4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.212594 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" event={"ID":"3175ff26-a9fa-4793-89f9-776156f8fdc6","Type":"ContainerStarted","Data":"f9d29322273deaf8bafa44afb741209ec77cd25056a368a42cf2883b5ffc0ef3"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.213107 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.218769 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.222720 4618 generic.go:334] "Generic (PLEG): container finished" podID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerID="86ebd49fb8b00ce2884395e2e24468ad725ad46a3841b29d0000918a583c9fe1" 
exitCode=0 Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.222770 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8wcpt" event={"ID":"c4468290-050b-4a6a-9388-cbbae3c71d68","Type":"ContainerDied","Data":"86ebd49fb8b00ce2884395e2e24468ad725ad46a3841b29d0000918a583c9fe1"} Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.229117 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" podStartSLOduration=7.229071958 podStartE2EDuration="7.229071958s" podCreationTimestamp="2026-01-21 09:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:05.22223356 +0000 UTC m=+163.972700878" watchObservedRunningTime="2026-01-21 09:06:05.229071958 +0000 UTC m=+163.979539275" Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.256232 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" podStartSLOduration=7.256221786 podStartE2EDuration="7.256221786s" podCreationTimestamp="2026-01-21 09:05:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:05.251845491 +0000 UTC m=+164.002312807" watchObservedRunningTime="2026-01-21 09:06:05.256221786 +0000 UTC m=+164.006689102" Jan 21 09:06:05 crc kubenswrapper[4618]: I0121 09:06:05.265220 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cwzlc" podStartSLOduration=3.198394848 podStartE2EDuration="23.265202205s" podCreationTimestamp="2026-01-21 09:05:42 +0000 UTC" firstStartedPulling="2026-01-21 09:05:43.839289952 +0000 UTC m=+142.589757270" lastFinishedPulling="2026-01-21 09:06:03.906097311 +0000 UTC m=+162.656564627" 
observedRunningTime="2026-01-21 09:06:05.262858906 +0000 UTC m=+164.013326223" watchObservedRunningTime="2026-01-21 09:06:05.265202205 +0000 UTC m=+164.015669522" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.228740 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpxzc" event={"ID":"d164c95c-cb58-47e7-a3a3-7e7bce8b9743","Type":"ContainerStarted","Data":"0c407792177ab03c84be4c96c57025810dd771424c105c089808e160dc4c4dfb"} Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.231704 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8qzs" event={"ID":"30a7821a-79fa-4a74-8cc9-220ddc395bba","Type":"ContainerStarted","Data":"0800e9020961c9729539db1c6741fe9ab6f44b8c9a005bef1c58d6befea59f01"} Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.233513 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8wcpt" event={"ID":"c4468290-050b-4a6a-9388-cbbae3c71d68","Type":"ContainerStarted","Data":"88ee925e0a6adafa342155f3a239f9c3a4e30fb8938b28d7876c99094dcb395e"} Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.235415 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t62l5" event={"ID":"69e6a09f-0983-4b1b-83a7-13e8acd56f61","Type":"ContainerStarted","Data":"6f9ae7fe1e02f33a55bf147ede316b8ccb4b0d18ad00ec80a56931a521a67e7c"} Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.238544 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf7bm" event={"ID":"792d0f04-c700-4062-9eb3-a98b0d5e41d9","Type":"ContainerStarted","Data":"898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da"} Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.248513 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpxzc" podStartSLOduration=144.24850263 
podStartE2EDuration="2m24.24850263s" podCreationTimestamp="2026-01-21 09:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:06.245038808 +0000 UTC m=+164.995506125" watchObservedRunningTime="2026-01-21 09:06:06.24850263 +0000 UTC m=+164.998969947" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.269474 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sf7bm" podStartSLOduration=2.471031327 podStartE2EDuration="22.269450325s" podCreationTimestamp="2026-01-21 09:05:44 +0000 UTC" firstStartedPulling="2026-01-21 09:05:45.871152438 +0000 UTC m=+144.621619755" lastFinishedPulling="2026-01-21 09:06:05.669571437 +0000 UTC m=+164.420038753" observedRunningTime="2026-01-21 09:06:06.268434829 +0000 UTC m=+165.018902146" watchObservedRunningTime="2026-01-21 09:06:06.269450325 +0000 UTC m=+165.019917642" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.304887 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b8qzs" podStartSLOduration=5.757077155 podStartE2EDuration="21.304867683s" podCreationTimestamp="2026-01-21 09:05:45 +0000 UTC" firstStartedPulling="2026-01-21 09:05:50.152969629 +0000 UTC m=+148.903436946" lastFinishedPulling="2026-01-21 09:06:05.700760157 +0000 UTC m=+164.451227474" observedRunningTime="2026-01-21 09:06:06.287237506 +0000 UTC m=+165.037704823" watchObservedRunningTime="2026-01-21 09:06:06.304867683 +0000 UTC m=+165.055335000" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.305865 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8wcpt" podStartSLOduration=5.72868828 podStartE2EDuration="21.30585698s" podCreationTimestamp="2026-01-21 09:05:45 +0000 UTC" firstStartedPulling="2026-01-21 09:05:50.152996269 +0000 UTC 
m=+148.903463585" lastFinishedPulling="2026-01-21 09:06:05.730164967 +0000 UTC m=+164.480632285" observedRunningTime="2026-01-21 09:06:06.303078465 +0000 UTC m=+165.053545781" watchObservedRunningTime="2026-01-21 09:06:06.30585698 +0000 UTC m=+165.056324297" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.317268 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t62l5" podStartSLOduration=2.433953722 podStartE2EDuration="22.317251481s" podCreationTimestamp="2026-01-21 09:05:44 +0000 UTC" firstStartedPulling="2026-01-21 09:05:45.869022769 +0000 UTC m=+144.619490087" lastFinishedPulling="2026-01-21 09:06:05.752320529 +0000 UTC m=+164.502787846" observedRunningTime="2026-01-21 09:06:06.316584478 +0000 UTC m=+165.067051796" watchObservedRunningTime="2026-01-21 09:06:06.317251481 +0000 UTC m=+165.067718798" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.968892 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:06:06 crc kubenswrapper[4618]: I0121 09:06:06.971796 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:06:12 crc kubenswrapper[4618]: I0121 09:06:12.615206 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:06:12 crc kubenswrapper[4618]: I0121 09:06:12.615761 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:06:12 crc kubenswrapper[4618]: I0121 09:06:12.700337 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:06:12 crc kubenswrapper[4618]: I0121 09:06:12.815216 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:06:12 crc kubenswrapper[4618]: I0121 09:06:12.815261 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:06:12 crc kubenswrapper[4618]: I0121 09:06:12.847977 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.031950 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.032011 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.062054 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.231118 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.231180 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.258409 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.301649 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.301764 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:06:13 crc 
kubenswrapper[4618]: I0121 09:06:13.301964 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:06:13 crc kubenswrapper[4618]: I0121 09:06:13.303876 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:06:14 crc kubenswrapper[4618]: I0121 09:06:14.705271 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvr4r"] Jan 21 09:06:14 crc kubenswrapper[4618]: I0121 09:06:14.826657 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:06:14 crc kubenswrapper[4618]: I0121 09:06:14.826700 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:06:14 crc kubenswrapper[4618]: I0121 09:06:14.858094 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.254794 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.254839 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.285337 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rvr4r" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="registry-server" containerID="cri-o://c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6" gracePeriod=2 Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.289089 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.319642 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.322207 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsw65"] Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.322467 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gsw65" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="registry-server" containerID="cri-o://81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b" gracePeriod=2 Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.338131 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.746189 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.750493 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.913087 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.913160 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.916755 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-catalog-content\") pod \"4bc1d458-1c8f-4afb-b209-be769710ccf2\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.916814 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vbrx\" (UniqueName: \"kubernetes.io/projected/4bc1d458-1c8f-4afb-b209-be769710ccf2-kube-api-access-7vbrx\") pod \"4bc1d458-1c8f-4afb-b209-be769710ccf2\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.916851 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njcdc\" (UniqueName: \"kubernetes.io/projected/71db5598-40a0-4583-88ee-7add145de7ac-kube-api-access-njcdc\") pod \"71db5598-40a0-4583-88ee-7add145de7ac\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.916882 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-utilities\") pod \"4bc1d458-1c8f-4afb-b209-be769710ccf2\" (UID: \"4bc1d458-1c8f-4afb-b209-be769710ccf2\") " Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.916939 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-utilities\") pod \"71db5598-40a0-4583-88ee-7add145de7ac\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.916982 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-catalog-content\") pod \"71db5598-40a0-4583-88ee-7add145de7ac\" (UID: \"71db5598-40a0-4583-88ee-7add145de7ac\") " Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.917622 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-utilities" (OuterVolumeSpecName: "utilities") pod "4bc1d458-1c8f-4afb-b209-be769710ccf2" (UID: "4bc1d458-1c8f-4afb-b209-be769710ccf2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.917644 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-utilities" (OuterVolumeSpecName: "utilities") pod "71db5598-40a0-4583-88ee-7add145de7ac" (UID: "71db5598-40a0-4583-88ee-7add145de7ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.921711 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc1d458-1c8f-4afb-b209-be769710ccf2-kube-api-access-7vbrx" (OuterVolumeSpecName: "kube-api-access-7vbrx") pod "4bc1d458-1c8f-4afb-b209-be769710ccf2" (UID: "4bc1d458-1c8f-4afb-b209-be769710ccf2"). InnerVolumeSpecName "kube-api-access-7vbrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.929419 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71db5598-40a0-4583-88ee-7add145de7ac-kube-api-access-njcdc" (OuterVolumeSpecName: "kube-api-access-njcdc") pod "71db5598-40a0-4583-88ee-7add145de7ac" (UID: "71db5598-40a0-4583-88ee-7add145de7ac"). InnerVolumeSpecName "kube-api-access-njcdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.944101 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.953764 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bc1d458-1c8f-4afb-b209-be769710ccf2" (UID: "4bc1d458-1c8f-4afb-b209-be769710ccf2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:15 crc kubenswrapper[4618]: I0121 09:06:15.963759 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71db5598-40a0-4583-88ee-7add145de7ac" (UID: "71db5598-40a0-4583-88ee-7add145de7ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.017767 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.017792 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vbrx\" (UniqueName: \"kubernetes.io/projected/4bc1d458-1c8f-4afb-b209-be769710ccf2-kube-api-access-7vbrx\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.017807 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njcdc\" (UniqueName: \"kubernetes.io/projected/71db5598-40a0-4583-88ee-7add145de7ac-kube-api-access-njcdc\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.017817 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc1d458-1c8f-4afb-b209-be769710ccf2-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.017824 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.017832 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71db5598-40a0-4583-88ee-7add145de7ac-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.226885 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.226938 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.255038 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.291051 4618 generic.go:334] "Generic (PLEG): container finished" podID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerID="81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b" exitCode=0 Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.291110 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsw65" event={"ID":"4bc1d458-1c8f-4afb-b209-be769710ccf2","Type":"ContainerDied","Data":"81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b"} Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.291132 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsw65" event={"ID":"4bc1d458-1c8f-4afb-b209-be769710ccf2","Type":"ContainerDied","Data":"33aa2592641ea139182a8214db880ad7647339b80570383cde43d43908eae4a9"} Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.291166 4618 scope.go:117] "RemoveContainer" containerID="81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.291184 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsw65" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.293240 4618 generic.go:334] "Generic (PLEG): container finished" podID="71db5598-40a0-4583-88ee-7add145de7ac" containerID="c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6" exitCode=0 Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.293319 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvr4r" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.293375 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvr4r" event={"ID":"71db5598-40a0-4583-88ee-7add145de7ac","Type":"ContainerDied","Data":"c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6"} Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.293412 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvr4r" event={"ID":"71db5598-40a0-4583-88ee-7add145de7ac","Type":"ContainerDied","Data":"ac1d620f2649ffba1e0b416cdc74297c05e43129371c21c11bd8eacb269ca149"} Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.304676 4618 scope.go:117] "RemoveContainer" containerID="4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.313564 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsw65"] Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.319862 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gsw65"] Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.324844 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.324902 4618 scope.go:117] "RemoveContainer" containerID="3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.335641 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rvr4r"] Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.338365 4618 scope.go:117] "RemoveContainer" containerID="81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b" Jan 21 09:06:16 crc 
kubenswrapper[4618]: E0121 09:06:16.338627 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b\": container with ID starting with 81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b not found: ID does not exist" containerID="81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.338656 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b"} err="failed to get container status \"81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b\": rpc error: code = NotFound desc = could not find container \"81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b\": container with ID starting with 81ebab8495f51d52e73f6686fc13f1dd4f1d1e281673f6276be8009c607ec57b not found: ID does not exist" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.338689 4618 scope.go:117] "RemoveContainer" containerID="4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.338866 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:06:16 crc kubenswrapper[4618]: E0121 09:06:16.338997 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d\": container with ID starting with 4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d not found: ID does not exist" containerID="4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.339013 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d"} err="failed to get container status \"4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d\": rpc error: code = NotFound desc = could not find container \"4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d\": container with ID starting with 4b4fd0f0ec5fed67818126f01dc5c5a3bfd1abe61a8f05dcb37622c48d8ec31d not found: ID does not exist" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.339026 4618 scope.go:117] "RemoveContainer" containerID="3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9" Jan 21 09:06:16 crc kubenswrapper[4618]: E0121 09:06:16.339217 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9\": container with ID starting with 3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9 not found: ID does not exist" containerID="3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.339236 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9"} err="failed to get container status \"3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9\": rpc error: code = NotFound desc = could not find container \"3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9\": container with ID starting with 3b4a7eeac172213448d337766facfa214d60dba51dc04125c6dd3d873f36c3a9 not found: ID does not exist" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.339248 4618 scope.go:117] "RemoveContainer" containerID="c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.339967 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-rvr4r"] Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.358269 4618 scope.go:117] "RemoveContainer" containerID="8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.373748 4618 scope.go:117] "RemoveContainer" containerID="1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.387274 4618 scope.go:117] "RemoveContainer" containerID="c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6" Jan 21 09:06:16 crc kubenswrapper[4618]: E0121 09:06:16.387589 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6\": container with ID starting with c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6 not found: ID does not exist" containerID="c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.387615 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6"} err="failed to get container status \"c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6\": rpc error: code = NotFound desc = could not find container \"c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6\": container with ID starting with c14503dd37a7f0179f48c77b65a15008b3dfe723e37f55be2c23cf7352dc40d6 not found: ID does not exist" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.387635 4618 scope.go:117] "RemoveContainer" containerID="8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f" Jan 21 09:06:16 crc kubenswrapper[4618]: E0121 09:06:16.387969 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f\": container with ID starting with 8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f not found: ID does not exist" containerID="8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.387998 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f"} err="failed to get container status \"8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f\": rpc error: code = NotFound desc = could not find container \"8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f\": container with ID starting with 8621bd0abb731df1f0df8e6ccca3d7d1cd15341426616484e4a2bfa727c5368f not found: ID does not exist" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.388017 4618 scope.go:117] "RemoveContainer" containerID="1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228" Jan 21 09:06:16 crc kubenswrapper[4618]: E0121 09:06:16.388571 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228\": container with ID starting with 1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228 not found: ID does not exist" containerID="1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228" Jan 21 09:06:16 crc kubenswrapper[4618]: I0121 09:06:16.388596 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228"} err="failed to get container status \"1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228\": rpc error: code = NotFound desc = could not find container 
\"1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228\": container with ID starting with 1af403fee169e4a7b4271baefd3833b7b6ada0ef933d0f835b768b393ff53228 not found: ID does not exist" Jan 21 09:06:17 crc kubenswrapper[4618]: I0121 09:06:17.548573 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" path="/var/lib/kubelet/pods/4bc1d458-1c8f-4afb-b209-be769710ccf2/volumes" Jan 21 09:06:17 crc kubenswrapper[4618]: I0121 09:06:17.549430 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71db5598-40a0-4583-88ee-7add145de7ac" path="/var/lib/kubelet/pods/71db5598-40a0-4583-88ee-7add145de7ac/volumes" Jan 21 09:06:17 crc kubenswrapper[4618]: I0121 09:06:17.708336 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf7bm"] Jan 21 09:06:17 crc kubenswrapper[4618]: I0121 09:06:17.709481 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sf7bm" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="registry-server" containerID="cri-o://898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da" gracePeriod=2 Jan 21 09:06:17 crc kubenswrapper[4618]: I0121 09:06:17.939532 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q99bm" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.055499 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.139574 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-utilities\") pod \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.139835 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-catalog-content\") pod \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.140188 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-utilities" (OuterVolumeSpecName: "utilities") pod "792d0f04-c700-4062-9eb3-a98b0d5e41d9" (UID: "792d0f04-c700-4062-9eb3-a98b0d5e41d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.156490 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "792d0f04-c700-4062-9eb3-a98b0d5e41d9" (UID: "792d0f04-c700-4062-9eb3-a98b0d5e41d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.240364 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7hb9\" (UniqueName: \"kubernetes.io/projected/792d0f04-c700-4062-9eb3-a98b0d5e41d9-kube-api-access-x7hb9\") pod \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\" (UID: \"792d0f04-c700-4062-9eb3-a98b0d5e41d9\") " Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.240507 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.240525 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792d0f04-c700-4062-9eb3-a98b0d5e41d9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.245303 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792d0f04-c700-4062-9eb3-a98b0d5e41d9-kube-api-access-x7hb9" (OuterVolumeSpecName: "kube-api-access-x7hb9") pod "792d0f04-c700-4062-9eb3-a98b0d5e41d9" (UID: "792d0f04-c700-4062-9eb3-a98b0d5e41d9"). InnerVolumeSpecName "kube-api-access-x7hb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.307008 4618 generic.go:334] "Generic (PLEG): container finished" podID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerID="898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da" exitCode=0 Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.307063 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf7bm" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.307110 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf7bm" event={"ID":"792d0f04-c700-4062-9eb3-a98b0d5e41d9","Type":"ContainerDied","Data":"898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da"} Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.307135 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf7bm" event={"ID":"792d0f04-c700-4062-9eb3-a98b0d5e41d9","Type":"ContainerDied","Data":"005998c143a4fd18ada78565cb3ea195b25e6de679c3b6e82468d80c28723006"} Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.307173 4618 scope.go:117] "RemoveContainer" containerID="898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.325198 4618 scope.go:117] "RemoveContainer" containerID="2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.327035 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf7bm"] Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.331997 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf7bm"] Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.341864 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7hb9\" (UniqueName: \"kubernetes.io/projected/792d0f04-c700-4062-9eb3-a98b0d5e41d9-kube-api-access-x7hb9\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.346750 4618 scope.go:117] "RemoveContainer" containerID="8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.360729 4618 scope.go:117] "RemoveContainer" 
containerID="898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da" Jan 21 09:06:18 crc kubenswrapper[4618]: E0121 09:06:18.361034 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da\": container with ID starting with 898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da not found: ID does not exist" containerID="898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.361079 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da"} err="failed to get container status \"898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da\": rpc error: code = NotFound desc = could not find container \"898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da\": container with ID starting with 898dde4989a9ba5b91de4bce9e290a9dc9de24d5a00039e64d31b1ebdb6de1da not found: ID does not exist" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.361096 4618 scope.go:117] "RemoveContainer" containerID="2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9" Jan 21 09:06:18 crc kubenswrapper[4618]: E0121 09:06:18.361549 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9\": container with ID starting with 2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9 not found: ID does not exist" containerID="2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.361580 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9"} err="failed to get container status \"2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9\": rpc error: code = NotFound desc = could not find container \"2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9\": container with ID starting with 2fc60214c03fffd30f39b4a77df48bd75d40daee99eba390c9fee6dbeb4826e9 not found: ID does not exist" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.361603 4618 scope.go:117] "RemoveContainer" containerID="8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d" Jan 21 09:06:18 crc kubenswrapper[4618]: E0121 09:06:18.361876 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d\": container with ID starting with 8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d not found: ID does not exist" containerID="8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.361932 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d"} err="failed to get container status \"8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d\": rpc error: code = NotFound desc = could not find container \"8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d\": container with ID starting with 8e5f856deef3541a901f58fc62e6e515014c3408914fecc7f58bc24e58d43b9d not found: ID does not exist" Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.767426 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5788dfcd64-ts25l"] Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.767805 4618 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" podUID="3175ff26-a9fa-4793-89f9-776156f8fdc6" containerName="controller-manager" containerID="cri-o://4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752" gracePeriod=30 Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.861031 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7"] Jan 21 09:06:18 crc kubenswrapper[4618]: I0121 09:06:18.861280 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" podUID="d6cbe290-68b9-4de5-b082-f5426c112cae" containerName="route-controller-manager" containerID="cri-o://2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a" gracePeriod=30 Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.206530 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.218074 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.313712 4618 generic.go:334] "Generic (PLEG): container finished" podID="d6cbe290-68b9-4de5-b082-f5426c112cae" containerID="2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a" exitCode=0 Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.313781 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" event={"ID":"d6cbe290-68b9-4de5-b082-f5426c112cae","Type":"ContainerDied","Data":"2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a"} Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.313808 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" event={"ID":"d6cbe290-68b9-4de5-b082-f5426c112cae","Type":"ContainerDied","Data":"ff7d951b6a9bb968277183fe81c6c982f77764c3c3245823045851d3e7eb3728"} Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.313826 4618 scope.go:117] "RemoveContainer" containerID="2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.313915 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.317065 4618 generic.go:334] "Generic (PLEG): container finished" podID="3175ff26-a9fa-4793-89f9-776156f8fdc6" containerID="4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752" exitCode=0 Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.317091 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" event={"ID":"3175ff26-a9fa-4793-89f9-776156f8fdc6","Type":"ContainerDied","Data":"4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752"} Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.317107 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" event={"ID":"3175ff26-a9fa-4793-89f9-776156f8fdc6","Type":"ContainerDied","Data":"f9d29322273deaf8bafa44afb741209ec77cd25056a368a42cf2883b5ffc0ef3"} Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.317173 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5788dfcd64-ts25l" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.329378 4618 scope.go:117] "RemoveContainer" containerID="2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.329830 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a\": container with ID starting with 2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a not found: ID does not exist" containerID="2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.329895 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a"} err="failed to get container status \"2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a\": rpc error: code = NotFound desc = could not find container \"2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a\": container with ID starting with 2621c941cf241cafc2e5c5fd59d187a9e0eda4a103ae1f2bac6265e4c28cd87a not found: ID does not exist" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.329954 4618 scope.go:117] "RemoveContainer" containerID="4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.345766 4618 scope.go:117] "RemoveContainer" containerID="4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.346282 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752\": container with ID starting with 
4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752 not found: ID does not exist" containerID="4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.346315 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752"} err="failed to get container status \"4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752\": rpc error: code = NotFound desc = could not find container \"4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752\": container with ID starting with 4dd44fe71d7901ee11785110f8be32f799f9bfd6016bbe401e76e94fe677a752 not found: ID does not exist" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353575 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbnmc\" (UniqueName: \"kubernetes.io/projected/3175ff26-a9fa-4793-89f9-776156f8fdc6-kube-api-access-bbnmc\") pod \"3175ff26-a9fa-4793-89f9-776156f8fdc6\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353655 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-client-ca\") pod \"3175ff26-a9fa-4793-89f9-776156f8fdc6\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353727 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3175ff26-a9fa-4793-89f9-776156f8fdc6-serving-cert\") pod \"3175ff26-a9fa-4793-89f9-776156f8fdc6\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353785 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-proxy-ca-bundles\") pod \"3175ff26-a9fa-4793-89f9-776156f8fdc6\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353806 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-config\") pod \"d6cbe290-68b9-4de5-b082-f5426c112cae\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353859 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cbe290-68b9-4de5-b082-f5426c112cae-serving-cert\") pod \"d6cbe290-68b9-4de5-b082-f5426c112cae\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353883 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-client-ca\") pod \"d6cbe290-68b9-4de5-b082-f5426c112cae\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353918 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfj75\" (UniqueName: \"kubernetes.io/projected/d6cbe290-68b9-4de5-b082-f5426c112cae-kube-api-access-jfj75\") pod \"d6cbe290-68b9-4de5-b082-f5426c112cae\" (UID: \"d6cbe290-68b9-4de5-b082-f5426c112cae\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.353950 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-config\") pod \"3175ff26-a9fa-4793-89f9-776156f8fdc6\" (UID: \"3175ff26-a9fa-4793-89f9-776156f8fdc6\") " Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.354345 4618 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-client-ca" (OuterVolumeSpecName: "client-ca") pod "3175ff26-a9fa-4793-89f9-776156f8fdc6" (UID: "3175ff26-a9fa-4793-89f9-776156f8fdc6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.354739 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-config" (OuterVolumeSpecName: "config") pod "d6cbe290-68b9-4de5-b082-f5426c112cae" (UID: "d6cbe290-68b9-4de5-b082-f5426c112cae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.354781 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6cbe290-68b9-4de5-b082-f5426c112cae" (UID: "d6cbe290-68b9-4de5-b082-f5426c112cae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.354804 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3175ff26-a9fa-4793-89f9-776156f8fdc6" (UID: "3175ff26-a9fa-4793-89f9-776156f8fdc6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.354982 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-config" (OuterVolumeSpecName: "config") pod "3175ff26-a9fa-4793-89f9-776156f8fdc6" (UID: "3175ff26-a9fa-4793-89f9-776156f8fdc6"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.357831 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3175ff26-a9fa-4793-89f9-776156f8fdc6-kube-api-access-bbnmc" (OuterVolumeSpecName: "kube-api-access-bbnmc") pod "3175ff26-a9fa-4793-89f9-776156f8fdc6" (UID: "3175ff26-a9fa-4793-89f9-776156f8fdc6"). InnerVolumeSpecName "kube-api-access-bbnmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.357869 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3175ff26-a9fa-4793-89f9-776156f8fdc6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3175ff26-a9fa-4793-89f9-776156f8fdc6" (UID: "3175ff26-a9fa-4793-89f9-776156f8fdc6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.357911 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cbe290-68b9-4de5-b082-f5426c112cae-kube-api-access-jfj75" (OuterVolumeSpecName: "kube-api-access-jfj75") pod "d6cbe290-68b9-4de5-b082-f5426c112cae" (UID: "d6cbe290-68b9-4de5-b082-f5426c112cae"). InnerVolumeSpecName "kube-api-access-jfj75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.358235 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cbe290-68b9-4de5-b082-f5426c112cae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6cbe290-68b9-4de5-b082-f5426c112cae" (UID: "d6cbe290-68b9-4de5-b082-f5426c112cae"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456429 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3175ff26-a9fa-4793-89f9-776156f8fdc6-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456554 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456575 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456587 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cbe290-68b9-4de5-b082-f5426c112cae-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456600 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6cbe290-68b9-4de5-b082-f5426c112cae-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456612 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfj75\" (UniqueName: \"kubernetes.io/projected/d6cbe290-68b9-4de5-b082-f5426c112cae-kube-api-access-jfj75\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456628 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456640 4618 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-bbnmc\" (UniqueName: \"kubernetes.io/projected/3175ff26-a9fa-4793-89f9-776156f8fdc6-kube-api-access-bbnmc\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.456650 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3175ff26-a9fa-4793-89f9-776156f8fdc6-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.543396 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" path="/var/lib/kubelet/pods/792d0f04-c700-4062-9eb3-a98b0d5e41d9/volumes" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.635764 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7"] Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.639768 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d47948cf9-k4pg7"] Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.641845 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5788dfcd64-ts25l"] Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.643714 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5788dfcd64-ts25l"] Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994520 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55545c7bd-gh66w"] Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994826 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="extract-utilities" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994838 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="extract-utilities" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994846 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="extract-content" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994853 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="extract-content" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994859 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cbe290-68b9-4de5-b082-f5426c112cae" containerName="route-controller-manager" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994865 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cbe290-68b9-4de5-b082-f5426c112cae" containerName="route-controller-manager" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994875 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3175ff26-a9fa-4793-89f9-776156f8fdc6" containerName="controller-manager" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994881 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3175ff26-a9fa-4793-89f9-776156f8fdc6" containerName="controller-manager" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994889 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="extract-utilities" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994895 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="extract-utilities" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994918 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="extract-content" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994923 4618 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="extract-content" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994932 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="extract-utilities" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994939 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="extract-utilities" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994948 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="extract-content" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994955 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="extract-content" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994961 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994966 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994978 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994983 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: E0121 09:06:19.994992 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.994996 4618 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.998207 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="792d0f04-c700-4062-9eb3-a98b0d5e41d9" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.998281 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="71db5598-40a0-4583-88ee-7add145de7ac" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.998301 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc1d458-1c8f-4afb-b209-be769710ccf2" containerName="registry-server" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.998327 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cbe290-68b9-4de5-b082-f5426c112cae" containerName="route-controller-manager" Jan 21 09:06:19 crc kubenswrapper[4618]: I0121 09:06:19.998341 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3175ff26-a9fa-4793-89f9-776156f8fdc6" containerName="controller-manager" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.000581 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.005339 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq"] Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.007777 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.007860 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.007921 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.008000 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.008680 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.009076 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.009228 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.013947 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.014494 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.014501 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.015078 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 
09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.015317 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.015630 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55545c7bd-gh66w"] Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.015701 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.023586 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq"] Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.025424 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062408 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-config\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062452 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-client-ca\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062473 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzgv\" (UniqueName: 
\"kubernetes.io/projected/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-kube-api-access-8nzgv\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062495 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-config\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062542 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-client-ca\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062569 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-proxy-ca-bundles\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062634 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-serving-cert\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " 
pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062683 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktd7\" (UniqueName: \"kubernetes.io/projected/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-kube-api-access-bktd7\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.062732 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-serving-cert\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.107776 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b8qzs"] Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.108117 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b8qzs" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="registry-server" containerID="cri-o://0800e9020961c9729539db1c6741fe9ab6f44b8c9a005bef1c58d6befea59f01" gracePeriod=2 Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164072 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-proxy-ca-bundles\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 
09:06:20.164124 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-serving-cert\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164172 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktd7\" (UniqueName: \"kubernetes.io/projected/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-kube-api-access-bktd7\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164210 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-serving-cert\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164269 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-config\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164298 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-client-ca\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " 
pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164324 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nzgv\" (UniqueName: \"kubernetes.io/projected/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-kube-api-access-8nzgv\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164347 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-config\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.164370 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-client-ca\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.165617 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-client-ca\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.165964 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-proxy-ca-bundles\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.166030 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-config\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.166309 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-client-ca\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.166517 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-config\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.168710 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-serving-cert\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.168766 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-serving-cert\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.178434 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktd7\" (UniqueName: \"kubernetes.io/projected/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-kube-api-access-bktd7\") pod \"route-controller-manager-5675bbd895-h2wxq\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.178467 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzgv\" (UniqueName: \"kubernetes.io/projected/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-kube-api-access-8nzgv\") pod \"controller-manager-55545c7bd-gh66w\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.317820 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.324605 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.331452 4618 generic.go:334] "Generic (PLEG): container finished" podID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerID="0800e9020961c9729539db1c6741fe9ab6f44b8c9a005bef1c58d6befea59f01" exitCode=0 Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.331488 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8qzs" event={"ID":"30a7821a-79fa-4a74-8cc9-220ddc395bba","Type":"ContainerDied","Data":"0800e9020961c9729539db1c6741fe9ab6f44b8c9a005bef1c58d6befea59f01"} Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.422830 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.572296 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-utilities\") pod \"30a7821a-79fa-4a74-8cc9-220ddc395bba\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.572938 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-utilities" (OuterVolumeSpecName: "utilities") pod "30a7821a-79fa-4a74-8cc9-220ddc395bba" (UID: "30a7821a-79fa-4a74-8cc9-220ddc395bba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.573114 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqx4h\" (UniqueName: \"kubernetes.io/projected/30a7821a-79fa-4a74-8cc9-220ddc395bba-kube-api-access-gqx4h\") pod \"30a7821a-79fa-4a74-8cc9-220ddc395bba\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.573510 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-catalog-content\") pod \"30a7821a-79fa-4a74-8cc9-220ddc395bba\" (UID: \"30a7821a-79fa-4a74-8cc9-220ddc395bba\") " Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.573732 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.576226 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a7821a-79fa-4a74-8cc9-220ddc395bba-kube-api-access-gqx4h" (OuterVolumeSpecName: "kube-api-access-gqx4h") pod "30a7821a-79fa-4a74-8cc9-220ddc395bba" (UID: "30a7821a-79fa-4a74-8cc9-220ddc395bba"). InnerVolumeSpecName "kube-api-access-gqx4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.663961 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30a7821a-79fa-4a74-8cc9-220ddc395bba" (UID: "30a7821a-79fa-4a74-8cc9-220ddc395bba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.675699 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqx4h\" (UniqueName: \"kubernetes.io/projected/30a7821a-79fa-4a74-8cc9-220ddc395bba-kube-api-access-gqx4h\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.675738 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a7821a-79fa-4a74-8cc9-220ddc395bba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.696958 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55545c7bd-gh66w"] Jan 21 09:06:20 crc kubenswrapper[4618]: W0121 09:06:20.701548 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00b474e_f54f_42e0_83a5_8dd131bfa5a3.slice/crio-923e5094423f3346226b4ea3c487a5feaf87bde17a5c39329fd203e62417347b WatchSource:0}: Error finding container 923e5094423f3346226b4ea3c487a5feaf87bde17a5c39329fd203e62417347b: Status 404 returned error can't find the container with id 923e5094423f3346226b4ea3c487a5feaf87bde17a5c39329fd203e62417347b Jan 21 09:06:20 crc kubenswrapper[4618]: I0121 09:06:20.728345 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq"] Jan 21 09:06:20 crc kubenswrapper[4618]: W0121 09:06:20.732112 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1e328f_bbe6_4a5e_a264_2d70e23b0e85.slice/crio-7004f8cba6354bb7c80ec89f92ea33887d069439d12bed0c3dda09100eb7459b WatchSource:0}: Error finding container 7004f8cba6354bb7c80ec89f92ea33887d069439d12bed0c3dda09100eb7459b: Status 404 returned error can't find the 
container with id 7004f8cba6354bb7c80ec89f92ea33887d069439d12bed0c3dda09100eb7459b Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.338582 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8qzs" event={"ID":"30a7821a-79fa-4a74-8cc9-220ddc395bba","Type":"ContainerDied","Data":"2daac383ceae2fbabbc63e493cf3ee285b4ddee68a810b5ed39fdb90959cb470"} Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.338623 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b8qzs" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.338639 4618 scope.go:117] "RemoveContainer" containerID="0800e9020961c9729539db1c6741fe9ab6f44b8c9a005bef1c58d6befea59f01" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.341086 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" event={"ID":"d00b474e-f54f-42e0-83a5-8dd131bfa5a3","Type":"ContainerStarted","Data":"d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f"} Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.341161 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" event={"ID":"d00b474e-f54f-42e0-83a5-8dd131bfa5a3","Type":"ContainerStarted","Data":"923e5094423f3346226b4ea3c487a5feaf87bde17a5c39329fd203e62417347b"} Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.341224 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.343170 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" 
event={"ID":"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85","Type":"ContainerStarted","Data":"9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff"} Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.343199 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" event={"ID":"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85","Type":"ContainerStarted","Data":"7004f8cba6354bb7c80ec89f92ea33887d069439d12bed0c3dda09100eb7459b"} Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.343376 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.346123 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.348557 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.355287 4618 scope.go:117] "RemoveContainer" containerID="818ccaabda4a04d7fada0b0700ceee7352238aa8ce48e36ac6c9b1fddfa4d247" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.356830 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" podStartSLOduration=3.356820688 podStartE2EDuration="3.356820688s" podCreationTimestamp="2026-01-21 09:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:21.354557198 +0000 UTC m=+180.105024515" watchObservedRunningTime="2026-01-21 09:06:21.356820688 +0000 UTC m=+180.107288004" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.386625 4618 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" podStartSLOduration=3.386613497 podStartE2EDuration="3.386613497s" podCreationTimestamp="2026-01-21 09:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:21.386474425 +0000 UTC m=+180.136941742" watchObservedRunningTime="2026-01-21 09:06:21.386613497 +0000 UTC m=+180.137080814" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.398789 4618 scope.go:117] "RemoveContainer" containerID="ecdbf89c996a4008bdc0711f7fbc98d3aa44dfc9c2515509c93569c5ee0f459c" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.417235 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b8qzs"] Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.421028 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b8qzs"] Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.552423 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" path="/var/lib/kubelet/pods/30a7821a-79fa-4a74-8cc9-220ddc395bba/volumes" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.553430 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3175ff26-a9fa-4793-89f9-776156f8fdc6" path="/var/lib/kubelet/pods/3175ff26-a9fa-4793-89f9-776156f8fdc6/volumes" Jan 21 09:06:21 crc kubenswrapper[4618]: I0121 09:06:21.553872 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cbe290-68b9-4de5-b082-f5426c112cae" path="/var/lib/kubelet/pods/d6cbe290-68b9-4de5-b082-f5426c112cae/volumes" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.090881 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 09:06:25 crc 
kubenswrapper[4618]: E0121 09:06:25.091291 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="extract-utilities" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.091302 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="extract-utilities" Jan 21 09:06:25 crc kubenswrapper[4618]: E0121 09:06:25.091310 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="extract-content" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.091315 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="extract-content" Jan 21 09:06:25 crc kubenswrapper[4618]: E0121 09:06:25.091326 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="registry-server" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.091331 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="registry-server" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.091418 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a7821a-79fa-4a74-8cc9-220ddc395bba" containerName="registry-server" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.091718 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.093202 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.093371 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.097601 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.122494 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4d99430-a717-456e-a731-ead00be5dce4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.122572 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4d99430-a717-456e-a731-ead00be5dce4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.223731 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4d99430-a717-456e-a731-ead00be5dce4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.223807 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b4d99430-a717-456e-a731-ead00be5dce4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.223831 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4d99430-a717-456e-a731-ead00be5dce4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.239688 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4d99430-a717-456e-a731-ead00be5dce4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.403863 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.692612 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fnvx2"] Jan 21 09:06:25 crc kubenswrapper[4618]: I0121 09:06:25.764899 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 09:06:25 crc kubenswrapper[4618]: W0121 09:06:25.769150 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb4d99430_a717_456e_a731_ead00be5dce4.slice/crio-02be03f6adb109705adcdd97f1ee2e4ae1af7d1ac2c7c640c0137569d4805f89 WatchSource:0}: Error finding container 02be03f6adb109705adcdd97f1ee2e4ae1af7d1ac2c7c640c0137569d4805f89: Status 404 returned error can't find the container with id 02be03f6adb109705adcdd97f1ee2e4ae1af7d1ac2c7c640c0137569d4805f89 Jan 21 09:06:26 crc kubenswrapper[4618]: I0121 09:06:26.372521 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b4d99430-a717-456e-a731-ead00be5dce4","Type":"ContainerStarted","Data":"eda4bb199dfaf83716e953aba1d4f1cc97e1e5e3bbececb9a8fcf7b69f62e29c"} Jan 21 09:06:26 crc kubenswrapper[4618]: I0121 09:06:26.372834 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b4d99430-a717-456e-a731-ead00be5dce4","Type":"ContainerStarted","Data":"02be03f6adb109705adcdd97f1ee2e4ae1af7d1ac2c7c640c0137569d4805f89"} Jan 21 09:06:26 crc kubenswrapper[4618]: I0121 09:06:26.383288 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.383275508 podStartE2EDuration="1.383275508s" podCreationTimestamp="2026-01-21 09:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 09:06:26.382778055 +0000 UTC m=+185.133245362" watchObservedRunningTime="2026-01-21 09:06:26.383275508 +0000 UTC m=+185.133742826" Jan 21 09:06:26 crc kubenswrapper[4618]: I0121 09:06:26.959219 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:06:26 crc kubenswrapper[4618]: I0121 09:06:26.959486 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:06:27 crc kubenswrapper[4618]: I0121 09:06:27.377383 4618 generic.go:334] "Generic (PLEG): container finished" podID="b4d99430-a717-456e-a731-ead00be5dce4" containerID="eda4bb199dfaf83716e953aba1d4f1cc97e1e5e3bbececb9a8fcf7b69f62e29c" exitCode=0 Jan 21 09:06:27 crc kubenswrapper[4618]: I0121 09:06:27.377420 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b4d99430-a717-456e-a731-ead00be5dce4","Type":"ContainerDied","Data":"eda4bb199dfaf83716e953aba1d4f1cc97e1e5e3bbececb9a8fcf7b69f62e29c"} Jan 21 09:06:27 crc kubenswrapper[4618]: I0121 09:06:27.656458 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.619389 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.768567 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4d99430-a717-456e-a731-ead00be5dce4-kube-api-access\") pod \"b4d99430-a717-456e-a731-ead00be5dce4\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.768600 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4d99430-a717-456e-a731-ead00be5dce4-kubelet-dir\") pod \"b4d99430-a717-456e-a731-ead00be5dce4\" (UID: \"b4d99430-a717-456e-a731-ead00be5dce4\") " Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.768766 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4d99430-a717-456e-a731-ead00be5dce4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b4d99430-a717-456e-a731-ead00be5dce4" (UID: "b4d99430-a717-456e-a731-ead00be5dce4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.772906 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d99430-a717-456e-a731-ead00be5dce4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b4d99430-a717-456e-a731-ead00be5dce4" (UID: "b4d99430-a717-456e-a731-ead00be5dce4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.870319 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4d99430-a717-456e-a731-ead00be5dce4-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:28 crc kubenswrapper[4618]: I0121 09:06:28.870346 4618 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4d99430-a717-456e-a731-ead00be5dce4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.387277 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b4d99430-a717-456e-a731-ead00be5dce4","Type":"ContainerDied","Data":"02be03f6adb109705adcdd97f1ee2e4ae1af7d1ac2c7c640c0137569d4805f89"} Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.387313 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02be03f6adb109705adcdd97f1ee2e4ae1af7d1ac2c7c640c0137569d4805f89" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.387324 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.688871 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 09:06:29 crc kubenswrapper[4618]: E0121 09:06:29.689048 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d99430-a717-456e-a731-ead00be5dce4" containerName="pruner" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.689060 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d99430-a717-456e-a731-ead00be5dce4" containerName="pruner" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.689160 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d99430-a717-456e-a731-ead00be5dce4" containerName="pruner" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.689454 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.691944 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.700414 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.701048 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.778101 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.778155 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-var-lock\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.778188 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71cc31c5-e39a-4571-b350-c9532b7752de-kube-api-access\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.878932 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-var-lock\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.878973 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71cc31c5-e39a-4571-b350-c9532b7752de-kube-api-access\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.879028 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.879063 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-var-lock\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.879074 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:29 crc kubenswrapper[4618]: I0121 09:06:29.892079 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71cc31c5-e39a-4571-b350-c9532b7752de-kube-api-access\") pod \"installer-9-crc\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:30 crc kubenswrapper[4618]: I0121 09:06:30.000817 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:06:30 crc kubenswrapper[4618]: I0121 09:06:30.331075 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 09:06:30 crc kubenswrapper[4618]: I0121 09:06:30.391497 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71cc31c5-e39a-4571-b350-c9532b7752de","Type":"ContainerStarted","Data":"9a3cdf46d05c90f6600359d116a3a40a9b3b41b05746473bb77e13e6d68d6ece"} Jan 21 09:06:31 crc kubenswrapper[4618]: I0121 09:06:31.397420 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71cc31c5-e39a-4571-b350-c9532b7752de","Type":"ContainerStarted","Data":"e938c79fcaa724f998f9a632314e3929d9c83d29d82aaa7cbd01d4327eb3c5ab"} Jan 21 09:06:31 crc kubenswrapper[4618]: I0121 09:06:31.412089 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.412062446 podStartE2EDuration="2.412062446s" podCreationTimestamp="2026-01-21 09:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:31.409200363 +0000 UTC m=+190.159667681" watchObservedRunningTime="2026-01-21 09:06:31.412062446 +0000 UTC m=+190.162529764" Jan 21 09:06:38 crc kubenswrapper[4618]: I0121 09:06:38.737979 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55545c7bd-gh66w"] Jan 21 09:06:38 crc kubenswrapper[4618]: I0121 09:06:38.738419 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" podUID="d00b474e-f54f-42e0-83a5-8dd131bfa5a3" containerName="controller-manager" 
containerID="cri-o://d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f" gracePeriod=30 Jan 21 09:06:38 crc kubenswrapper[4618]: I0121 09:06:38.745321 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq"] Jan 21 09:06:38 crc kubenswrapper[4618]: I0121 09:06:38.745524 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" podUID="5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" containerName="route-controller-manager" containerID="cri-o://9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff" gracePeriod=30 Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.169668 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.174055 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-client-ca\") pod \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.174088 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-serving-cert\") pod \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.174108 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bktd7\" (UniqueName: \"kubernetes.io/projected/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-kube-api-access-bktd7\") pod \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") 
" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.174133 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-config\") pod \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\" (UID: \"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.174655 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-client-ca" (OuterVolumeSpecName: "client-ca") pod "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" (UID: "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.174752 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-config" (OuterVolumeSpecName: "config") pod "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" (UID: "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.178189 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" (UID: "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.178339 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-kube-api-access-bktd7" (OuterVolumeSpecName: "kube-api-access-bktd7") pod "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" (UID: "5d1e328f-bbe6-4a5e-a264-2d70e23b0e85"). 
InnerVolumeSpecName "kube-api-access-bktd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.240363 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.275757 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.275791 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.275801 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bktd7\" (UniqueName: \"kubernetes.io/projected/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-kube-api-access-bktd7\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.275813 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.376576 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-serving-cert\") pod \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.376621 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-proxy-ca-bundles\") pod 
\"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.376646 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-client-ca\") pod \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.376684 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-config\") pod \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.376730 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nzgv\" (UniqueName: \"kubernetes.io/projected/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-kube-api-access-8nzgv\") pod \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\" (UID: \"d00b474e-f54f-42e0-83a5-8dd131bfa5a3\") " Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.377381 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d00b474e-f54f-42e0-83a5-8dd131bfa5a3" (UID: "d00b474e-f54f-42e0-83a5-8dd131bfa5a3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.377409 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "d00b474e-f54f-42e0-83a5-8dd131bfa5a3" (UID: "d00b474e-f54f-42e0-83a5-8dd131bfa5a3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.377498 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-config" (OuterVolumeSpecName: "config") pod "d00b474e-f54f-42e0-83a5-8dd131bfa5a3" (UID: "d00b474e-f54f-42e0-83a5-8dd131bfa5a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.379131 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d00b474e-f54f-42e0-83a5-8dd131bfa5a3" (UID: "d00b474e-f54f-42e0-83a5-8dd131bfa5a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.379176 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-kube-api-access-8nzgv" (OuterVolumeSpecName: "kube-api-access-8nzgv") pod "d00b474e-f54f-42e0-83a5-8dd131bfa5a3" (UID: "d00b474e-f54f-42e0-83a5-8dd131bfa5a3"). InnerVolumeSpecName "kube-api-access-8nzgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.429491 4618 generic.go:334] "Generic (PLEG): container finished" podID="5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" containerID="9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff" exitCode=0 Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.429529 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" event={"ID":"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85","Type":"ContainerDied","Data":"9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff"} Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.429577 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" event={"ID":"5d1e328f-bbe6-4a5e-a264-2d70e23b0e85","Type":"ContainerDied","Data":"7004f8cba6354bb7c80ec89f92ea33887d069439d12bed0c3dda09100eb7459b"} Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.429596 4618 scope.go:117] "RemoveContainer" containerID="9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.429798 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.430870 4618 generic.go:334] "Generic (PLEG): container finished" podID="d00b474e-f54f-42e0-83a5-8dd131bfa5a3" containerID="d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f" exitCode=0 Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.430902 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" event={"ID":"d00b474e-f54f-42e0-83a5-8dd131bfa5a3","Type":"ContainerDied","Data":"d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f"} Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.430923 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" event={"ID":"d00b474e-f54f-42e0-83a5-8dd131bfa5a3","Type":"ContainerDied","Data":"923e5094423f3346226b4ea3c487a5feaf87bde17a5c39329fd203e62417347b"} Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.430939 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55545c7bd-gh66w" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.443285 4618 scope.go:117] "RemoveContainer" containerID="9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff" Jan 21 09:06:39 crc kubenswrapper[4618]: E0121 09:06:39.443731 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff\": container with ID starting with 9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff not found: ID does not exist" containerID="9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.443763 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff"} err="failed to get container status \"9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff\": rpc error: code = NotFound desc = could not find container \"9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff\": container with ID starting with 9d5f8cc45ff9e440ef2ba6c1047b7f305adfda176a8df294133c7ad5c95656ff not found: ID does not exist" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.443786 4618 scope.go:117] "RemoveContainer" containerID="d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.453846 4618 scope.go:117] "RemoveContainer" containerID="d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.453975 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq"] Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.455817 4618 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-5675bbd895-h2wxq"] Jan 21 09:06:39 crc kubenswrapper[4618]: E0121 09:06:39.456211 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f\": container with ID starting with d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f not found: ID does not exist" containerID="d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.456257 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f"} err="failed to get container status \"d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f\": rpc error: code = NotFound desc = could not find container \"d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f\": container with ID starting with d4dbf3c184c976d3c615ab03492a57330f6990862a7315c1e2feb333e393550f not found: ID does not exist" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.461597 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55545c7bd-gh66w"] Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.463921 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55545c7bd-gh66w"] Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.478334 4618 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.478357 4618 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.478368 4618 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.478376 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.478387 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nzgv\" (UniqueName: \"kubernetes.io/projected/d00b474e-f54f-42e0-83a5-8dd131bfa5a3-kube-api-access-8nzgv\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.546534 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" path="/var/lib/kubelet/pods/5d1e328f-bbe6-4a5e-a264-2d70e23b0e85/volumes" Jan 21 09:06:39 crc kubenswrapper[4618]: I0121 09:06:39.547014 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00b474e-f54f-42e0-83a5-8dd131bfa5a3" path="/var/lib/kubelet/pods/d00b474e-f54f-42e0-83a5-8dd131bfa5a3/volumes" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.005358 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf"] Jan 21 09:06:40 crc kubenswrapper[4618]: E0121 09:06:40.005630 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00b474e-f54f-42e0-83a5-8dd131bfa5a3" containerName="controller-manager" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.005644 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00b474e-f54f-42e0-83a5-8dd131bfa5a3" containerName="controller-manager" 
Jan 21 09:06:40 crc kubenswrapper[4618]: E0121 09:06:40.005663 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" containerName="route-controller-manager" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.005670 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" containerName="route-controller-manager" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.005760 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1e328f-bbe6-4a5e-a264-2d70e23b0e85" containerName="route-controller-manager" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.005778 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00b474e-f54f-42e0-83a5-8dd131bfa5a3" containerName="controller-manager" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.006231 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.007641 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv"] Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.007714 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.008237 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.008272 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.008382 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.008762 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.009619 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.010187 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.010933 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.010959 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.011079 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.011115 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.013477 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.013739 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.018664 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv"] Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.021294 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf"] Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.021347 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085000 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-config\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tc6m\" (UniqueName: \"kubernetes.io/projected/dac73184-1338-4c52-9db2-16cf72732c5d-kube-api-access-4tc6m\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085069 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/994a125d-9d16-4821-9ab8-9575cb5edab8-client-ca\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 
09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085101 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac73184-1338-4c52-9db2-16cf72732c5d-serving-cert\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085181 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994a125d-9d16-4821-9ab8-9575cb5edab8-serving-cert\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085221 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994a125d-9d16-4821-9ab8-9575cb5edab8-config\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085277 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5n7b\" (UniqueName: \"kubernetes.io/projected/994a125d-9d16-4821-9ab8-9575cb5edab8-kube-api-access-b5n7b\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085356 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-proxy-ca-bundles\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.085416 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-client-ca\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186400 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-proxy-ca-bundles\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186464 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-client-ca\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186505 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-config\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186525 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tc6m\" (UniqueName: \"kubernetes.io/projected/dac73184-1338-4c52-9db2-16cf72732c5d-kube-api-access-4tc6m\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186550 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/994a125d-9d16-4821-9ab8-9575cb5edab8-client-ca\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186571 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac73184-1338-4c52-9db2-16cf72732c5d-serving-cert\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186591 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994a125d-9d16-4821-9ab8-9575cb5edab8-serving-cert\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186619 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994a125d-9d16-4821-9ab8-9575cb5edab8-config\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " 
pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.186649 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5n7b\" (UniqueName: \"kubernetes.io/projected/994a125d-9d16-4821-9ab8-9575cb5edab8-kube-api-access-b5n7b\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.187288 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-client-ca\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.187745 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/994a125d-9d16-4821-9ab8-9575cb5edab8-client-ca\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.188048 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994a125d-9d16-4821-9ab8-9575cb5edab8-config\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.188291 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-proxy-ca-bundles\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.188576 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dac73184-1338-4c52-9db2-16cf72732c5d-config\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.189775 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/994a125d-9d16-4821-9ab8-9575cb5edab8-serving-cert\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.192899 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac73184-1338-4c52-9db2-16cf72732c5d-serving-cert\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.199411 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5n7b\" (UniqueName: \"kubernetes.io/projected/994a125d-9d16-4821-9ab8-9575cb5edab8-kube-api-access-b5n7b\") pod \"route-controller-manager-c96ff9756-ggxjv\" (UID: \"994a125d-9d16-4821-9ab8-9575cb5edab8\") " pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.199943 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tc6m\" (UniqueName: \"kubernetes.io/projected/dac73184-1338-4c52-9db2-16cf72732c5d-kube-api-access-4tc6m\") pod \"controller-manager-848b6b6c7f-tn4sf\" (UID: \"dac73184-1338-4c52-9db2-16cf72732c5d\") " pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.318324 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.323944 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.665837 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf"] Jan 21 09:06:40 crc kubenswrapper[4618]: I0121 09:06:40.703445 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv"] Jan 21 09:06:40 crc kubenswrapper[4618]: W0121 09:06:40.715137 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994a125d_9d16_4821_9ab8_9575cb5edab8.slice/crio-15ec6df3d3b7008a87736e7270574c69807e5a72a991bf40bbc6f72e24d9f008 WatchSource:0}: Error finding container 15ec6df3d3b7008a87736e7270574c69807e5a72a991bf40bbc6f72e24d9f008: Status 404 returned error can't find the container with id 15ec6df3d3b7008a87736e7270574c69807e5a72a991bf40bbc6f72e24d9f008 Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.443100 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" 
event={"ID":"994a125d-9d16-4821-9ab8-9575cb5edab8","Type":"ContainerStarted","Data":"e20bacad4495662dd60bf3f06a58cc8217191f718d1bca811dba62711460e8b3"} Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.443424 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" event={"ID":"994a125d-9d16-4821-9ab8-9575cb5edab8","Type":"ContainerStarted","Data":"15ec6df3d3b7008a87736e7270574c69807e5a72a991bf40bbc6f72e24d9f008"} Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.443439 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.444515 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" event={"ID":"dac73184-1338-4c52-9db2-16cf72732c5d","Type":"ContainerStarted","Data":"e3f98f74e3a6f12b41eee6e84500ceb2cd58823e2ede258ac58701f2beaf728a"} Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.444549 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" event={"ID":"dac73184-1338-4c52-9db2-16cf72732c5d","Type":"ContainerStarted","Data":"fb485364818a92ac2db9155675fa3fdc05edfa730c34fb88e87f90d03c138c5d"} Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.444808 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.447726 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.447984 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.461338 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c96ff9756-ggxjv" podStartSLOduration=3.461327878 podStartE2EDuration="3.461327878s" podCreationTimestamp="2026-01-21 09:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:41.458860623 +0000 UTC m=+200.209327941" watchObservedRunningTime="2026-01-21 09:06:41.461327878 +0000 UTC m=+200.211795195" Jan 21 09:06:41 crc kubenswrapper[4618]: I0121 09:06:41.478519 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-848b6b6c7f-tn4sf" podStartSLOduration=3.478500027 podStartE2EDuration="3.478500027s" podCreationTimestamp="2026-01-21 09:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:41.471630007 +0000 UTC m=+200.222097325" watchObservedRunningTime="2026-01-21 09:06:41.478500027 +0000 UTC m=+200.228967344" Jan 21 09:06:50 crc kubenswrapper[4618]: I0121 09:06:50.710989 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" podUID="57e2a16d-ea83-4e99-844e-089ccba97f47" containerName="oauth-openshift" containerID="cri-o://2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6" gracePeriod=15 Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.075511 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213494 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-trusted-ca-bundle\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213547 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-cliconfig\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213570 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-session\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213612 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-policies\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213660 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-serving-cert\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 
crc kubenswrapper[4618]: I0121 09:06:51.213689 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-error\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213710 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-router-certs\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213735 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-dir\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213757 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-login\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213784 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-idp-0-file-data\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213822 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-ocp-branding-template\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213843 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9lk2\" (UniqueName: \"kubernetes.io/projected/57e2a16d-ea83-4e99-844e-089ccba97f47-kube-api-access-j9lk2\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213871 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-provider-selection\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213886 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.213905 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-service-ca\") pod \"57e2a16d-ea83-4e99-844e-089ccba97f47\" (UID: \"57e2a16d-ea83-4e99-844e-089ccba97f47\") " Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.214064 4618 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.214278 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.214411 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.214744 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.214842 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.219122 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.219386 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.219466 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e2a16d-ea83-4e99-844e-089ccba97f47-kube-api-access-j9lk2" (OuterVolumeSpecName: "kube-api-access-j9lk2") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "kube-api-access-j9lk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.219621 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.220046 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.220758 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.220887 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.221089 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.221227 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "57e2a16d-ea83-4e99-844e-089ccba97f47" (UID: "57e2a16d-ea83-4e99-844e-089ccba97f47"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314843 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314885 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314897 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9lk2\" (UniqueName: \"kubernetes.io/projected/57e2a16d-ea83-4e99-844e-089ccba97f47-kube-api-access-j9lk2\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314906 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314918 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314926 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314934 4618 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314944 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314953 4618 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57e2a16d-ea83-4e99-844e-089ccba97f47-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314961 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314969 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314978 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.314986 4618 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57e2a16d-ea83-4e99-844e-089ccba97f47-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 
09:06:51.489302 4618 generic.go:334] "Generic (PLEG): container finished" podID="57e2a16d-ea83-4e99-844e-089ccba97f47" containerID="2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6" exitCode=0 Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.489354 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" event={"ID":"57e2a16d-ea83-4e99-844e-089ccba97f47","Type":"ContainerDied","Data":"2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6"} Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.489387 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" event={"ID":"57e2a16d-ea83-4e99-844e-089ccba97f47","Type":"ContainerDied","Data":"0b34bb744e5ece1e83c13da7480e6c48ed6027512711ba27ff867cf6609e60be"} Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.489408 4618 scope.go:117] "RemoveContainer" containerID="2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.489559 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fnvx2" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.506639 4618 scope.go:117] "RemoveContainer" containerID="2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6" Jan 21 09:06:51 crc kubenswrapper[4618]: E0121 09:06:51.507099 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6\": container with ID starting with 2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6 not found: ID does not exist" containerID="2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.507154 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6"} err="failed to get container status \"2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6\": rpc error: code = NotFound desc = could not find container \"2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6\": container with ID starting with 2e6d0a6fd4969929fbba78c83c93cab81f8d7032ecec4d62e4181d5485f633d6 not found: ID does not exist" Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.518165 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fnvx2"] Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.520650 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fnvx2"] Jan 21 09:06:51 crc kubenswrapper[4618]: I0121 09:06:51.542531 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e2a16d-ea83-4e99-844e-089ccba97f47" path="/var/lib/kubelet/pods/57e2a16d-ea83-4e99-844e-089ccba97f47/volumes" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 
09:06:52.012343 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d79d86965-m8wqc"] Jan 21 09:06:52 crc kubenswrapper[4618]: E0121 09:06:52.012570 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e2a16d-ea83-4e99-844e-089ccba97f47" containerName="oauth-openshift" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.012582 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e2a16d-ea83-4e99-844e-089ccba97f47" containerName="oauth-openshift" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.012679 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e2a16d-ea83-4e99-844e-089ccba97f47" containerName="oauth-openshift" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.013044 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.014622 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.014812 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.015393 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.015660 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.016175 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.016368 4618 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.016372 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.016497 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.016734 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.017354 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.017867 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.019314 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.021665 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d79d86965-m8wqc"] Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.022377 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023005 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " 
pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023053 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023079 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023119 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023134 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023175 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-audit-policies\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023195 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023268 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023298 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-session\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023315 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023349 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a231743-f7cb-4954-ab42-d05274dd351b-audit-dir\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023367 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023398 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4vzk\" (UniqueName: \"kubernetes.io/projected/6a231743-f7cb-4954-ab42-d05274dd351b-kube-api-access-p4vzk\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " 
pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.023453 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.028552 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124101 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124157 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-session\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124180 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124202 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/6a231743-f7cb-4954-ab42-d05274dd351b-audit-dir\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124222 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124240 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4vzk\" (UniqueName: \"kubernetes.io/projected/6a231743-f7cb-4954-ab42-d05274dd351b-kube-api-access-p4vzk\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124260 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124279 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 
09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124297 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124321 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124335 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124337 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a231743-f7cb-4954-ab42-d05274dd351b-audit-dir\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124349 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124365 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-audit-policies\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.124383 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.125938 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-audit-policies\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.126092 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: 
I0121 09:06:52.126270 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.126288 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.129502 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136486 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136552 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136602 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136680 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-session\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136750 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136776 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.136491 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a231743-f7cb-4954-ab42-d05274dd351b-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.138368 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4vzk\" (UniqueName: \"kubernetes.io/projected/6a231743-f7cb-4954-ab42-d05274dd351b-kube-api-access-p4vzk\") pod \"oauth-openshift-5d79d86965-m8wqc\" (UID: \"6a231743-f7cb-4954-ab42-d05274dd351b\") " pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.325076 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:52 crc kubenswrapper[4618]: I0121 09:06:52.650613 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d79d86965-m8wqc"] Jan 21 09:06:53 crc kubenswrapper[4618]: I0121 09:06:53.499256 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" event={"ID":"6a231743-f7cb-4954-ab42-d05274dd351b","Type":"ContainerStarted","Data":"f1deb03b9f8a8cc7f12547ec4cbd8f3c3cb653289ca151b59435fa3fa509be09"} Jan 21 09:06:53 crc kubenswrapper[4618]: I0121 09:06:53.499559 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:53 crc kubenswrapper[4618]: I0121 09:06:53.499570 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" 
event={"ID":"6a231743-f7cb-4954-ab42-d05274dd351b","Type":"ContainerStarted","Data":"a30ea3e55fee3b7f74214d00001cfe150b84a43081aea273eab60d91a0781cff"} Jan 21 09:06:53 crc kubenswrapper[4618]: I0121 09:06:53.504006 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" Jan 21 09:06:53 crc kubenswrapper[4618]: I0121 09:06:53.513219 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d79d86965-m8wqc" podStartSLOduration=28.513200378 podStartE2EDuration="28.513200378s" podCreationTimestamp="2026-01-21 09:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:06:53.512363332 +0000 UTC m=+212.262830649" watchObservedRunningTime="2026-01-21 09:06:53.513200378 +0000 UTC m=+212.263667694" Jan 21 09:06:56 crc kubenswrapper[4618]: I0121 09:06:56.959078 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:06:56 crc kubenswrapper[4618]: I0121 09:06:56.959362 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:06:56 crc kubenswrapper[4618]: I0121 09:06:56.959401 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:06:56 crc kubenswrapper[4618]: I0121 09:06:56.959883 4618 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:06:56 crc kubenswrapper[4618]: I0121 09:06:56.959946 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b" gracePeriod=600 Jan 21 09:06:57 crc kubenswrapper[4618]: I0121 09:06:57.517422 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b" exitCode=0 Jan 21 09:06:57 crc kubenswrapper[4618]: I0121 09:06:57.517503 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b"} Jan 21 09:06:57 crc kubenswrapper[4618]: I0121 09:06:57.517648 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"db8abbeb18512486c7d5cdd62c22db63be4c62bacb9e602c0b05fd7df41ce206"} Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.986714 4618 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988241 4618 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988388 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988446 4618 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988629 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56" gracePeriod=15 Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988695 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5" gracePeriod=15 Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988768 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b" gracePeriod=15 Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988741 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92" gracePeriod=15 Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988815 4618 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4" gracePeriod=15 Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 09:07:07.988966 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.988986 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 09:07:07.989003 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989009 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 09:07:07.989017 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989023 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 09:07:07.989030 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989035 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 
09:07:07.989043 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989048 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 09:07:07.989062 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989067 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 09:07:07 crc kubenswrapper[4618]: E0121 09:07:07.989074 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989081 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989172 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989183 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989191 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989200 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989206 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.989213 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 09:07:07 crc kubenswrapper[4618]: I0121 09:07:07.993422 4618 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.018207 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184236 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184286 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184321 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184335 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184348 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184364 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184494 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.184555 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285389 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285438 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285455 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285470 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285484 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285505 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285536 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285536 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285564 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285513 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285596 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285630 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285610 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285691 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285734 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.285777 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.315308 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:08 crc kubenswrapper[4618]: W0121 09:07:08.331224 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-165d133ed709e05bed69d600fd06d9b955d5c8faac57aa25840b0412bc57f61c WatchSource:0}: Error finding container 165d133ed709e05bed69d600fd06d9b955d5c8faac57aa25840b0412bc57f61c: Status 404 returned error can't find the container with id 165d133ed709e05bed69d600fd06d9b955d5c8faac57aa25840b0412bc57f61c Jan 21 09:07:08 crc kubenswrapper[4618]: E0121 09:07:08.333968 4618 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.98:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188cb3c8b943bf83 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 09:07:08.332818307 +0000 UTC m=+227.083285624,LastTimestamp:2026-01-21 09:07:08.332818307 +0000 UTC m=+227.083285624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.561099 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240"} Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.561154 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"165d133ed709e05bed69d600fd06d9b955d5c8faac57aa25840b0412bc57f61c"} Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.561714 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.563106 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.564224 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.564939 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92" exitCode=0 Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.564961 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b" exitCode=0 Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.564969 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4" exitCode=0 Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.564975 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5" exitCode=2 Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.565031 4618 scope.go:117] "RemoveContainer" containerID="97923b52f9c6e36b247b36b543f0e9513a7d62d4ca937b0e2c85c3a1f5891b47" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.566409 4618 generic.go:334] "Generic (PLEG): container finished" podID="71cc31c5-e39a-4571-b350-c9532b7752de" containerID="e938c79fcaa724f998f9a632314e3929d9c83d29d82aaa7cbd01d4327eb3c5ab" exitCode=0 Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.566443 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71cc31c5-e39a-4571-b350-c9532b7752de","Type":"ContainerDied","Data":"e938c79fcaa724f998f9a632314e3929d9c83d29d82aaa7cbd01d4327eb3c5ab"} Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.566765 4618 status_manager.go:851] "Failed to get 
status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:08 crc kubenswrapper[4618]: I0121 09:07:08.567064 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:09 crc kubenswrapper[4618]: I0121 09:07:09.571855 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 09:07:09 crc kubenswrapper[4618]: I0121 09:07:09.838434 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:07:09 crc kubenswrapper[4618]: I0121 09:07:09.838786 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:09 crc kubenswrapper[4618]: I0121 09:07:09.839080 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.015795 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71cc31c5-e39a-4571-b350-c9532b7752de-kube-api-access\") pod \"71cc31c5-e39a-4571-b350-c9532b7752de\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.016394 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-var-lock\") pod \"71cc31c5-e39a-4571-b350-c9532b7752de\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.016450 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-kubelet-dir\") pod \"71cc31c5-e39a-4571-b350-c9532b7752de\" (UID: \"71cc31c5-e39a-4571-b350-c9532b7752de\") " Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.016539 4618 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-var-lock" (OuterVolumeSpecName: "var-lock") pod "71cc31c5-e39a-4571-b350-c9532b7752de" (UID: "71cc31c5-e39a-4571-b350-c9532b7752de"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.016553 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71cc31c5-e39a-4571-b350-c9532b7752de" (UID: "71cc31c5-e39a-4571-b350-c9532b7752de"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.018848 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cc31c5-e39a-4571-b350-c9532b7752de-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71cc31c5-e39a-4571-b350-c9532b7752de" (UID: "71cc31c5-e39a-4571-b350-c9532b7752de"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.117910 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71cc31c5-e39a-4571-b350-c9532b7752de-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.118113 4618 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-var-lock\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.118122 4618 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71cc31c5-e39a-4571-b350-c9532b7752de-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.235597 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.236295 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.236886 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.237216 4618 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.237512 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319443 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319513 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319540 4618 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319561 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319581 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319694 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319855 4618 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319871 4618 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.319879 4618 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.577530 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.578005 4618 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56" exitCode=0 Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.578061 4618 scope.go:117] "RemoveContainer" containerID="5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.578076 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.579486 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71cc31c5-e39a-4571-b350-c9532b7752de","Type":"ContainerDied","Data":"9a3cdf46d05c90f6600359d116a3a40a9b3b41b05746473bb77e13e6d68d6ece"} Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.579514 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a3cdf46d05c90f6600359d116a3a40a9b3b41b05746473bb77e13e6d68d6ece" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.579517 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.588492 4618 scope.go:117] "RemoveContainer" containerID="2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.590533 4618 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.590842 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.591173 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.591413 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.591598 4618 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.591837 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.597984 4618 scope.go:117] "RemoveContainer" containerID="d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.605940 4618 scope.go:117] "RemoveContainer" containerID="7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.613195 4618 scope.go:117] "RemoveContainer" containerID="87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.624041 4618 scope.go:117] 
"RemoveContainer" containerID="4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.636872 4618 scope.go:117] "RemoveContainer" containerID="5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92" Jan 21 09:07:10 crc kubenswrapper[4618]: E0121 09:07:10.637119 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\": container with ID starting with 5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92 not found: ID does not exist" containerID="5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.637162 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92"} err="failed to get container status \"5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\": rpc error: code = NotFound desc = could not find container \"5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92\": container with ID starting with 5c3856321639d5007b44a61726aa0baf5723898717ee3ae21c0fdb0a13637b92 not found: ID does not exist" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.637180 4618 scope.go:117] "RemoveContainer" containerID="2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b" Jan 21 09:07:10 crc kubenswrapper[4618]: E0121 09:07:10.637434 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\": container with ID starting with 2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b not found: ID does not exist" containerID="2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b" Jan 21 09:07:10 crc 
kubenswrapper[4618]: I0121 09:07:10.637480 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b"} err="failed to get container status \"2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\": rpc error: code = NotFound desc = could not find container \"2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b\": container with ID starting with 2009592fe38150dc24cef2f27ab24c2c2b9497f14dfd207f3a0c3dad7f11dc4b not found: ID does not exist" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.637499 4618 scope.go:117] "RemoveContainer" containerID="d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4" Jan 21 09:07:10 crc kubenswrapper[4618]: E0121 09:07:10.637933 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\": container with ID starting with d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4 not found: ID does not exist" containerID="d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.637965 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4"} err="failed to get container status \"d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\": rpc error: code = NotFound desc = could not find container \"d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4\": container with ID starting with d60f52172b849db6bc6d338d004ff75fc055a177bb81070893322d3a51d0a7a4 not found: ID does not exist" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.637998 4618 scope.go:117] "RemoveContainer" containerID="7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5" Jan 21 
09:07:10 crc kubenswrapper[4618]: E0121 09:07:10.638286 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\": container with ID starting with 7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5 not found: ID does not exist" containerID="7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.638307 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5"} err="failed to get container status \"7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\": rpc error: code = NotFound desc = could not find container \"7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5\": container with ID starting with 7ddb7d57485d9833c2feb19edca7afe7a7c9a1eaf7eee7ad90ad623729d7c7b5 not found: ID does not exist" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.638322 4618 scope.go:117] "RemoveContainer" containerID="87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56" Jan 21 09:07:10 crc kubenswrapper[4618]: E0121 09:07:10.638566 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\": container with ID starting with 87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56 not found: ID does not exist" containerID="87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.638598 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56"} err="failed to get container status 
\"87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\": rpc error: code = NotFound desc = could not find container \"87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56\": container with ID starting with 87c0892c41ac0da54f2b1dc446f7d4cf4a44c406e8122b962eb700f952153e56 not found: ID does not exist" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.638611 4618 scope.go:117] "RemoveContainer" containerID="4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649" Jan 21 09:07:10 crc kubenswrapper[4618]: E0121 09:07:10.638859 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\": container with ID starting with 4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649 not found: ID does not exist" containerID="4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649" Jan 21 09:07:10 crc kubenswrapper[4618]: I0121 09:07:10.638881 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649"} err="failed to get container status \"4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\": rpc error: code = NotFound desc = could not find container \"4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649\": container with ID starting with 4adb414cf456c2dab13d6a81a0c4f4e7b56980e9cd600ef7ab4ca8936004f649 not found: ID does not exist" Jan 21 09:07:11 crc kubenswrapper[4618]: I0121 09:07:11.538766 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:11 
crc kubenswrapper[4618]: I0121 09:07:11.539252 4618 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:11 crc kubenswrapper[4618]: I0121 09:07:11.539533 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:11 crc kubenswrapper[4618]: I0121 09:07:11.543347 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 21 09:07:15 crc kubenswrapper[4618]: E0121 09:07:15.567186 4618 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.25.98:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" volumeName="registry-storage" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.250502 4618 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.250973 4618 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.251245 4618 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.251423 4618 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.251826 4618 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: I0121 09:07:16.251940 4618 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.252594 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="200ms" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.453529 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="400ms" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.854744 4618 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="800ms" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.856138 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:07:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:07:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:07:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-21T09:07:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.856378 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.856563 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.856836 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.857034 4618 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:16 crc kubenswrapper[4618]: E0121 09:07:16.857051 4618 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 21 09:07:17 crc kubenswrapper[4618]: E0121 09:07:17.655770 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="1.6s" Jan 21 09:07:18 crc kubenswrapper[4618]: E0121 09:07:18.048755 4618 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.98:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188cb3c8b943bf83 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 09:07:08.332818307 +0000 UTC m=+227.083285624,LastTimestamp:2026-01-21 09:07:08.332818307 +0000 UTC m=+227.083285624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 09:07:19 crc kubenswrapper[4618]: E0121 09:07:19.256272 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="3.2s" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.536901 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.538977 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.539241 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.540359 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.540599 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.549313 4618 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.549337 4618 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:21 crc kubenswrapper[4618]: E0121 09:07:21.549567 4618 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.549839 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:21 crc kubenswrapper[4618]: W0121 09:07:21.564373 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8bada9f490afb09a8a0272e0631be3ffdef90628a6c969d042cb7cb3a309328b WatchSource:0}: Error finding container 8bada9f490afb09a8a0272e0631be3ffdef90628a6c969d042cb7cb3a309328b: Status 404 returned error can't find the container with id 8bada9f490afb09a8a0272e0631be3ffdef90628a6c969d042cb7cb3a309328b Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.622917 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.622974 4618 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49" exitCode=1 Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.623029 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49"} Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.623454 4618 scope.go:117] "RemoveContainer" containerID="e48bba4ca7ffd8762ce42dbcd77bbc76556f31efc21593467c30ea09f6024b49" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.623665 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection 
refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.624003 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.624228 4618 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:21 crc kubenswrapper[4618]: I0121 09:07:21.625377 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8bada9f490afb09a8a0272e0631be3ffdef90628a6c969d042cb7cb3a309328b"} Jan 21 09:07:22 crc kubenswrapper[4618]: E0121 09:07:22.457577 4618 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.98:6443: connect: connection refused" interval="6.4s" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.631492 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.631566 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cd96d4647c72b1a7dac10aa8b837702faaf3c325b6ecc59db8486c55db8f0350"} Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.632171 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.632406 4618 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.632612 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.633208 4618 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b25f17b3efb2cdbbb950728bc71494fa512d2d01573cb730bec1ae9ee180433b" exitCode=0 Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.633244 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b25f17b3efb2cdbbb950728bc71494fa512d2d01573cb730bec1ae9ee180433b"} Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.633435 
4618 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.633452 4618 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.633755 4618 status_manager.go:851] "Failed to get status for pod" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:22 crc kubenswrapper[4618]: E0121 09:07:22.633898 4618 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.634252 4618 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:22 crc kubenswrapper[4618]: I0121 09:07:22.634487 4618 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.25.98:6443: connect: connection refused" Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.638807 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"36e19cf970ba9efd2f673b9fbd44588425fb1594b84084afdccda4959f098b56"} Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.638846 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5db57abfc1eac1f937ad65d1744f6e9cb3d778c7cad211690abbdbef48e3768b"} Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.638856 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2243d317ed297fded1c44207daae3b0fbdf7c7c11092d46a30cae0a119fd4d77"} Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.638865 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"adff903a6a033282d2d983e5f49911ea9e2256f6daa6d1720895581168caac4f"} Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.638874 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7cb4d1ae1bc6e43ff3465a50422d2240800725b7d21acaa1ac3299103e476154"} Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.638982 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.639081 4618 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:23 crc kubenswrapper[4618]: I0121 09:07:23.639101 4618 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:26 crc kubenswrapper[4618]: I0121 09:07:26.550929 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:26 crc kubenswrapper[4618]: I0121 09:07:26.551136 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:26 crc kubenswrapper[4618]: I0121 09:07:26.554329 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:26 crc kubenswrapper[4618]: I0121 09:07:26.755689 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:07:28 crc kubenswrapper[4618]: I0121 09:07:28.508646 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:07:28 crc kubenswrapper[4618]: I0121 09:07:28.511979 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:07:28 crc kubenswrapper[4618]: I0121 09:07:28.813969 4618 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:29 crc kubenswrapper[4618]: I0121 09:07:29.662209 4618 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:29 crc kubenswrapper[4618]: I0121 09:07:29.662249 4618 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:29 crc kubenswrapper[4618]: I0121 09:07:29.665796 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:30 crc kubenswrapper[4618]: I0121 09:07:30.666022 4618 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:30 crc kubenswrapper[4618]: I0121 09:07:30.666051 4618 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="597ab3f9-d061-44c9-9a42-5abcfa77a11d" Jan 21 09:07:31 crc kubenswrapper[4618]: I0121 09:07:31.548841 4618 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="834e56a6-61fe-4806-b7c1-e9b805fa6af3" Jan 21 09:07:36 crc kubenswrapper[4618]: I0121 09:07:36.759804 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 09:07:38 crc kubenswrapper[4618]: I0121 09:07:38.478746 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 09:07:38 crc kubenswrapper[4618]: I0121 09:07:38.726069 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 21 09:07:38 crc kubenswrapper[4618]: I0121 09:07:38.726274 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 09:07:39 crc kubenswrapper[4618]: I0121 09:07:39.254344 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 09:07:39 crc kubenswrapper[4618]: I0121 09:07:39.560230 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 09:07:39 crc kubenswrapper[4618]: I0121 09:07:39.677705 4618 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 09:07:39 crc kubenswrapper[4618]: I0121 09:07:39.890807 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 09:07:39 crc kubenswrapper[4618]: I0121 09:07:39.894374 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 21 09:07:40 crc kubenswrapper[4618]: I0121 09:07:40.282394 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 21 09:07:40 crc kubenswrapper[4618]: I0121 09:07:40.481606 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 21 09:07:40 crc kubenswrapper[4618]: I0121 09:07:40.937872 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.031227 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.160225 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.200030 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.200074 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.343477 4618 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.375494 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.401565 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.402616 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.500691 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.570462 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.579441 4618 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.710343 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.751460 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.765242 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.797820 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 09:07:41 crc kubenswrapper[4618]: I0121 09:07:41.953062 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.047737 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.060725 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.087834 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.250867 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.362052 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.367083 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.670873 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.778099 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.951829 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 09:07:42 crc kubenswrapper[4618]: I0121 09:07:42.984086 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 09:07:43 crc 
kubenswrapper[4618]: I0121 09:07:43.194396 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.235256 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.406604 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.505529 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.650214 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.669945 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.768665 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.807649 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.859563 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.891605 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 09:07:43 crc kubenswrapper[4618]: I0121 09:07:43.944281 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.297539 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.404861 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.426471 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.448950 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.484353 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.501710 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.563501 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.658916 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.692956 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.788429 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 09:07:44 crc kubenswrapper[4618]: 
I0121 09:07:44.832931 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.865633 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.882486 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.904391 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 09:07:44 crc kubenswrapper[4618]: I0121 09:07:44.920269 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.003730 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.103601 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.196429 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.260303 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.325509 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.358045 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 
09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.394713 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.394720 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.458036 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.466280 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.552317 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.600775 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.613978 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.637208 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.656559 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.676959 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.681471 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.777247 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.841799 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.919133 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.921427 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.946888 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 09:07:45 crc kubenswrapper[4618]: I0121 09:07:45.988130 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.003350 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.024123 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.121134 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.135031 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 21 09:07:46 crc 
kubenswrapper[4618]: I0121 09:07:46.338972 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.567520 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.631016 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.677405 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.695578 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.763881 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.772092 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.790299 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.803355 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.863400 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.910099 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.916414 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 09:07:46 crc kubenswrapper[4618]: I0121 09:07:46.978233 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.084997 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.203367 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.213569 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.287519 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.309906 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.347744 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.372095 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.387040 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.396217 4618 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.440056 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.569385 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.631220 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.722750 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.772733 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 09:07:47 crc kubenswrapper[4618]: I0121 09:07:47.935328 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.001164 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.023031 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.056514 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.061902 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 09:07:48 crc 
kubenswrapper[4618]: I0121 09:07:48.113317 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.190866 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.221493 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.292548 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.322304 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.347228 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.369525 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.495198 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.533851 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.535842 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.536292 4618 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.553213 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.559775 4618 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.571733 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.715554 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.749339 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.762389 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.817654 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.824892 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.861800 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.951399 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.952391 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 09:07:48 crc kubenswrapper[4618]: I0121 09:07:48.985503 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.006170 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.119714 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.177650 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.222612 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.234902 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.264167 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.349335 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.353342 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.356277 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.408996 4618 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.419135 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.457705 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.463129 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.530455 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.550055 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.559004 4618 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.562118 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.56209272 podStartE2EDuration="41.56209272s" podCreationTimestamp="2026-01-21 09:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:07:28.867873911 +0000 UTC m=+247.618341228" watchObservedRunningTime="2026-01-21 09:07:49.56209272 +0000 UTC m=+268.312560037" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.564915 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.564997 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.568734 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.579290 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.579271943 podStartE2EDuration="21.579271943s" podCreationTimestamp="2026-01-21 09:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:07:49.577322796 +0000 UTC m=+268.327790113" watchObservedRunningTime="2026-01-21 09:07:49.579271943 +0000 UTC m=+268.329739260" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.618580 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.701530 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.717667 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.837488 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 21 09:07:49 crc kubenswrapper[4618]: I0121 09:07:49.931255 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 09:07:49 crc 
kubenswrapper[4618]: I0121 09:07:49.935750 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.057804 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.105222 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.187653 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.194827 4618 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.195215 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240" gracePeriod=5 Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.241207 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.278943 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.359220 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.395713 4618 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console-operator"/"trusted-ca" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.560880 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.586021 4618 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.681464 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.693724 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.854152 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.862569 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.931259 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.953555 4618 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.955719 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 09:07:50 crc kubenswrapper[4618]: I0121 09:07:50.958940 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 21 09:07:50 
crc kubenswrapper[4618]: I0121 09:07:50.996387 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.104358 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.111551 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.115582 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.228952 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.242866 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.333832 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.382707 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.396335 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.396511 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.450529 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.505901 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.506978 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.568036 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.609511 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.690386 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.708126 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.806463 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.841492 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.913343 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 21 09:07:51 crc kubenswrapper[4618]: I0121 09:07:51.991184 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 21 09:07:52 
crc kubenswrapper[4618]: I0121 09:07:52.063402 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.130110 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.145074 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.164908 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.179514 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.243818 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.352028 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.427741 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.531774 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.618911 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.721435 4618 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.907588 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.973182 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 09:07:52 crc kubenswrapper[4618]: I0121 09:07:52.983398 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.065019 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.103575 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.130655 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.230107 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.347850 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.423448 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.470493 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 
09:07:53.573745 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.574108 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.702819 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.715233 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.729433 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.803639 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.947946 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 21 09:07:53 crc kubenswrapper[4618]: I0121 09:07:53.992447 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.032509 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.153048 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.176451 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.249188 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.255855 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.291874 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.714883 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.727749 4618 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.882573 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 09:07:54 crc kubenswrapper[4618]: I0121 09:07:54.911567 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.149620 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.240549 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.566091 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.638632 
4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.749835 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.749909 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.779811 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.779857 4618 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240" exitCode=137 Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.779905 4618 scope.go:117] "RemoveContainer" containerID="fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.779917 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.790236 4618 scope.go:117] "RemoveContainer" containerID="fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240" Jan 21 09:07:55 crc kubenswrapper[4618]: E0121 09:07:55.790461 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240\": container with ID starting with fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240 not found: ID does not exist" containerID="fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.790490 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240"} err="failed to get container status \"fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240\": rpc error: code = NotFound desc = could not find container \"fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240\": container with ID starting with fe0b3de792ef9446c9de50e5d5db20e0902fe5b1ea53fda93f67f0b409f2d240 not found: ID does not exist" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819575 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819628 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 
09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819661 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819680 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819721 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819704 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819786 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819810 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.819740 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.821645 4618 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.821689 4618 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.821704 4618 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.821716 4618 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.825975 4618 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:07:55 crc kubenswrapper[4618]: I0121 09:07:55.923581 4618 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 21 09:07:56 crc kubenswrapper[4618]: I0121 09:07:56.457223 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 09:07:56 crc kubenswrapper[4618]: I0121 09:07:56.667431 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 09:07:56 crc kubenswrapper[4618]: I0121 09:07:56.912635 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.542653 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.543325 4618 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.549717 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.549741 4618 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="21ec8943-534e-4f47-81f7-cb8984871f33" Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.553368 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.553469 4618 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="21ec8943-534e-4f47-81f7-cb8984871f33" Jan 21 09:07:57 crc kubenswrapper[4618]: I0121 09:07:57.614078 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.575904 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cwzlc"] Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.576605 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cwzlc" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="registry-server" containerID="cri-o://453a79e35c186154f3fb060a24b1d797a2c494f654a8abb255ddc31ea17934b7" gracePeriod=30 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.579036 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qr6lr"] Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.579286 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qr6lr" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="registry-server" containerID="cri-o://7100f5900f56f994582a364d2fa1ec70980642ea217cdfcbf00f26ff4c6fa65f" gracePeriod=30 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.588522 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-55flm"] Jan 21 09:08:00 crc kubenswrapper[4618]: 
I0121 09:08:00.588749 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" podUID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" containerName="marketplace-operator" containerID="cri-o://452a6e9e37b1b5ed72e29a0a789a22568e187987e0fe300d313f915d3551551a" gracePeriod=30 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.593410 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t62l5"] Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.593880 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t62l5" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="registry-server" containerID="cri-o://6f9ae7fe1e02f33a55bf147ede316b8ccb4b0d18ad00ec80a56931a521a67e7c" gracePeriod=30 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.611494 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8wcpt"] Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.611686 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8wcpt" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="registry-server" containerID="cri-o://88ee925e0a6adafa342155f3a239f9c3a4e30fb8938b28d7876c99094dcb395e" gracePeriod=30 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.642958 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mpc9"] Jan 21 09:08:00 crc kubenswrapper[4618]: E0121 09:08:00.643809 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" containerName="installer" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.643835 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" containerName="installer" Jan 
21 09:08:00 crc kubenswrapper[4618]: E0121 09:08:00.643900 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.643910 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.644353 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc31c5-e39a-4571-b350-c9532b7752de" containerName="installer" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.644411 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.651544 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.669810 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mpc9"] Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.675962 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftlfp\" (UniqueName: \"kubernetes.io/projected/9eb45d53-b317-4346-9a4e-679ff4473d3d-kube-api-access-ftlfp\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.676044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9eb45d53-b317-4346-9a4e-679ff4473d3d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: 
\"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.676068 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eb45d53-b317-4346-9a4e-679ff4473d3d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.776824 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftlfp\" (UniqueName: \"kubernetes.io/projected/9eb45d53-b317-4346-9a4e-679ff4473d3d-kube-api-access-ftlfp\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.777094 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9eb45d53-b317-4346-9a4e-679ff4473d3d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.777119 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eb45d53-b317-4346-9a4e-679ff4473d3d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.778362 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eb45d53-b317-4346-9a4e-679ff4473d3d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.783674 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9eb45d53-b317-4346-9a4e-679ff4473d3d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.790827 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftlfp\" (UniqueName: \"kubernetes.io/projected/9eb45d53-b317-4346-9a4e-679ff4473d3d-kube-api-access-ftlfp\") pod \"marketplace-operator-79b997595-4mpc9\" (UID: \"9eb45d53-b317-4346-9a4e-679ff4473d3d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.804701 4618 generic.go:334] "Generic (PLEG): container finished" podID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerID="88ee925e0a6adafa342155f3a239f9c3a4e30fb8938b28d7876c99094dcb395e" exitCode=0 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.804767 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8wcpt" event={"ID":"c4468290-050b-4a6a-9388-cbbae3c71d68","Type":"ContainerDied","Data":"88ee925e0a6adafa342155f3a239f9c3a4e30fb8938b28d7876c99094dcb395e"} Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.806470 4618 generic.go:334] "Generic (PLEG): container finished" podID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerID="6f9ae7fe1e02f33a55bf147ede316b8ccb4b0d18ad00ec80a56931a521a67e7c" exitCode=0 Jan 21 09:08:00 
crc kubenswrapper[4618]: I0121 09:08:00.806523 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t62l5" event={"ID":"69e6a09f-0983-4b1b-83a7-13e8acd56f61","Type":"ContainerDied","Data":"6f9ae7fe1e02f33a55bf147ede316b8ccb4b0d18ad00ec80a56931a521a67e7c"} Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.808470 4618 generic.go:334] "Generic (PLEG): container finished" podID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerID="7100f5900f56f994582a364d2fa1ec70980642ea217cdfcbf00f26ff4c6fa65f" exitCode=0 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.808530 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr6lr" event={"ID":"c1c156bc-1694-457b-b26e-c46d6b5be62d","Type":"ContainerDied","Data":"7100f5900f56f994582a364d2fa1ec70980642ea217cdfcbf00f26ff4c6fa65f"} Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.810402 4618 generic.go:334] "Generic (PLEG): container finished" podID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerID="453a79e35c186154f3fb060a24b1d797a2c494f654a8abb255ddc31ea17934b7" exitCode=0 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.810486 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwzlc" event={"ID":"18ec235a-04ac-489e-92cd-e1e69c8a1074","Type":"ContainerDied","Data":"453a79e35c186154f3fb060a24b1d797a2c494f654a8abb255ddc31ea17934b7"} Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.811758 4618 generic.go:334] "Generic (PLEG): container finished" podID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" containerID="452a6e9e37b1b5ed72e29a0a789a22568e187987e0fe300d313f915d3551551a" exitCode=0 Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.811803 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" 
event={"ID":"7867ab6f-4cdb-492d-9106-b1f42a66b62e","Type":"ContainerDied","Data":"452a6e9e37b1b5ed72e29a0a789a22568e187987e0fe300d313f915d3551551a"} Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.993192 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:00 crc kubenswrapper[4618]: I0121 09:08:00.996565 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.002068 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.005034 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.020968 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.023885 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080450 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-trusted-ca\") pod \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080498 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-utilities\") pod \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080524 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tfv7\" (UniqueName: \"kubernetes.io/projected/c1c156bc-1694-457b-b26e-c46d6b5be62d-kube-api-access-8tfv7\") pod \"c1c156bc-1694-457b-b26e-c46d6b5be62d\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080563 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-operator-metrics\") pod \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080625 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7rl5\" (UniqueName: \"kubernetes.io/projected/c4468290-050b-4a6a-9388-cbbae3c71d68-kube-api-access-l7rl5\") pod \"c4468290-050b-4a6a-9388-cbbae3c71d68\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080639 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-utilities\") pod \"18ec235a-04ac-489e-92cd-e1e69c8a1074\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080680 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-catalog-content\") pod \"c4468290-050b-4a6a-9388-cbbae3c71d68\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080706 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhdp9\" (UniqueName: \"kubernetes.io/projected/69e6a09f-0983-4b1b-83a7-13e8acd56f61-kube-api-access-lhdp9\") pod \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080728 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k227f\" (UniqueName: \"kubernetes.io/projected/7867ab6f-4cdb-492d-9106-b1f42a66b62e-kube-api-access-k227f\") pod \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\" (UID: \"7867ab6f-4cdb-492d-9106-b1f42a66b62e\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080746 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-catalog-content\") pod \"18ec235a-04ac-489e-92cd-e1e69c8a1074\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080779 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqhz8\" (UniqueName: \"kubernetes.io/projected/18ec235a-04ac-489e-92cd-e1e69c8a1074-kube-api-access-nqhz8\") 
pod \"18ec235a-04ac-489e-92cd-e1e69c8a1074\" (UID: \"18ec235a-04ac-489e-92cd-e1e69c8a1074\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080819 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-utilities\") pod \"c1c156bc-1694-457b-b26e-c46d6b5be62d\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080834 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-catalog-content\") pod \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\" (UID: \"69e6a09f-0983-4b1b-83a7-13e8acd56f61\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080857 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-catalog-content\") pod \"c1c156bc-1694-457b-b26e-c46d6b5be62d\" (UID: \"c1c156bc-1694-457b-b26e-c46d6b5be62d\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.080881 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-utilities\") pod \"c4468290-050b-4a6a-9388-cbbae3c71d68\" (UID: \"c4468290-050b-4a6a-9388-cbbae3c71d68\") " Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.081107 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7867ab6f-4cdb-492d-9106-b1f42a66b62e" (UID: "7867ab6f-4cdb-492d-9106-b1f42a66b62e"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.081213 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-utilities" (OuterVolumeSpecName: "utilities") pod "69e6a09f-0983-4b1b-83a7-13e8acd56f61" (UID: "69e6a09f-0983-4b1b-83a7-13e8acd56f61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.081912 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-utilities" (OuterVolumeSpecName: "utilities") pod "c4468290-050b-4a6a-9388-cbbae3c71d68" (UID: "c4468290-050b-4a6a-9388-cbbae3c71d68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.083120 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-utilities" (OuterVolumeSpecName: "utilities") pod "c1c156bc-1694-457b-b26e-c46d6b5be62d" (UID: "c1c156bc-1694-457b-b26e-c46d6b5be62d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.084677 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-utilities" (OuterVolumeSpecName: "utilities") pod "18ec235a-04ac-489e-92cd-e1e69c8a1074" (UID: "18ec235a-04ac-489e-92cd-e1e69c8a1074"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.087476 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e6a09f-0983-4b1b-83a7-13e8acd56f61-kube-api-access-lhdp9" (OuterVolumeSpecName: "kube-api-access-lhdp9") pod "69e6a09f-0983-4b1b-83a7-13e8acd56f61" (UID: "69e6a09f-0983-4b1b-83a7-13e8acd56f61"). InnerVolumeSpecName "kube-api-access-lhdp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.087835 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7867ab6f-4cdb-492d-9106-b1f42a66b62e-kube-api-access-k227f" (OuterVolumeSpecName: "kube-api-access-k227f") pod "7867ab6f-4cdb-492d-9106-b1f42a66b62e" (UID: "7867ab6f-4cdb-492d-9106-b1f42a66b62e"). InnerVolumeSpecName "kube-api-access-k227f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.089374 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ec235a-04ac-489e-92cd-e1e69c8a1074-kube-api-access-nqhz8" (OuterVolumeSpecName: "kube-api-access-nqhz8") pod "18ec235a-04ac-489e-92cd-e1e69c8a1074" (UID: "18ec235a-04ac-489e-92cd-e1e69c8a1074"). InnerVolumeSpecName "kube-api-access-nqhz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.091018 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c156bc-1694-457b-b26e-c46d6b5be62d-kube-api-access-8tfv7" (OuterVolumeSpecName: "kube-api-access-8tfv7") pod "c1c156bc-1694-457b-b26e-c46d6b5be62d" (UID: "c1c156bc-1694-457b-b26e-c46d6b5be62d"). InnerVolumeSpecName "kube-api-access-8tfv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.091198 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7867ab6f-4cdb-492d-9106-b1f42a66b62e" (UID: "7867ab6f-4cdb-492d-9106-b1f42a66b62e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.092643 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4468290-050b-4a6a-9388-cbbae3c71d68-kube-api-access-l7rl5" (OuterVolumeSpecName: "kube-api-access-l7rl5") pod "c4468290-050b-4a6a-9388-cbbae3c71d68" (UID: "c4468290-050b-4a6a-9388-cbbae3c71d68"). InnerVolumeSpecName "kube-api-access-l7rl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.107047 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69e6a09f-0983-4b1b-83a7-13e8acd56f61" (UID: "69e6a09f-0983-4b1b-83a7-13e8acd56f61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.137250 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18ec235a-04ac-489e-92cd-e1e69c8a1074" (UID: "18ec235a-04ac-489e-92cd-e1e69c8a1074"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.143303 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1c156bc-1694-457b-b26e-c46d6b5be62d" (UID: "c1c156bc-1694-457b-b26e-c46d6b5be62d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182162 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182188 4618 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182199 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tfv7\" (UniqueName: \"kubernetes.io/projected/c1c156bc-1694-457b-b26e-c46d6b5be62d-kube-api-access-8tfv7\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182209 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182217 4618 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7867ab6f-4cdb-492d-9106-b1f42a66b62e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182228 4618 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-l7rl5\" (UniqueName: \"kubernetes.io/projected/c4468290-050b-4a6a-9388-cbbae3c71d68-kube-api-access-l7rl5\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182236 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182243 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhdp9\" (UniqueName: \"kubernetes.io/projected/69e6a09f-0983-4b1b-83a7-13e8acd56f61-kube-api-access-lhdp9\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182251 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k227f\" (UniqueName: \"kubernetes.io/projected/7867ab6f-4cdb-492d-9106-b1f42a66b62e-kube-api-access-k227f\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182259 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ec235a-04ac-489e-92cd-e1e69c8a1074-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182267 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqhz8\" (UniqueName: \"kubernetes.io/projected/18ec235a-04ac-489e-92cd-e1e69c8a1074-kube-api-access-nqhz8\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182275 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182282 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/69e6a09f-0983-4b1b-83a7-13e8acd56f61-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.182291 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c156bc-1694-457b-b26e-c46d6b5be62d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.187100 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4468290-050b-4a6a-9388-cbbae3c71d68" (UID: "c4468290-050b-4a6a-9388-cbbae3c71d68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.283513 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4468290-050b-4a6a-9388-cbbae3c71d68-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.368767 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mpc9"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.817516 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8wcpt" event={"ID":"c4468290-050b-4a6a-9388-cbbae3c71d68","Type":"ContainerDied","Data":"8b09f5eccd18505be023c5fc33be11f43cbc85fc4a5acd5e3d1c383538a75143"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.817555 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8wcpt" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.817579 4618 scope.go:117] "RemoveContainer" containerID="88ee925e0a6adafa342155f3a239f9c3a4e30fb8938b28d7876c99094dcb395e" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.820589 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t62l5" event={"ID":"69e6a09f-0983-4b1b-83a7-13e8acd56f61","Type":"ContainerDied","Data":"5ce6d88a26a24a04a53b410db725b0989e541c2f7f713c6ec13b684650cd0b25"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.820612 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t62l5" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.823938 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr6lr" event={"ID":"c1c156bc-1694-457b-b26e-c46d6b5be62d","Type":"ContainerDied","Data":"b70794f7d98e400c9d9c82da1afb31c35d76a6e1013b3b95bdd0a6ff3471bd39"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.824036 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr6lr" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.826342 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cwzlc" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.826749 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cwzlc" event={"ID":"18ec235a-04ac-489e-92cd-e1e69c8a1074","Type":"ContainerDied","Data":"559d7e24931e29cff667e8f91a5b743504047817e20aefd28c0764c76d380bed"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.828057 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.828206 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-55flm" event={"ID":"7867ab6f-4cdb-492d-9106-b1f42a66b62e","Type":"ContainerDied","Data":"348a5cccdfc016a870cdcdeabe33352ad6734437168d47404d41164b063aee5d"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.829044 4618 scope.go:117] "RemoveContainer" containerID="86ebd49fb8b00ce2884395e2e24468ad725ad46a3841b29d0000918a583c9fe1" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.829496 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" event={"ID":"9eb45d53-b317-4346-9a4e-679ff4473d3d","Type":"ContainerStarted","Data":"a1dd1b57379b3b0e1ec0de29dc8c790658dcbe2c77b40c6582134fed6d973dfc"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.829524 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" event={"ID":"9eb45d53-b317-4346-9a4e-679ff4473d3d","Type":"ContainerStarted","Data":"6de388993c50478ce49ec2a8faa62e97842563066a86ce0bcbf88818180f2d94"} Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.829539 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.834616 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.836969 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8wcpt"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.841196 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-8wcpt"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.844233 4618 scope.go:117] "RemoveContainer" containerID="b7a7c0a85806504fe8ab35ba83b3635f0832feb7165b524a435ab4bd85c54a1f" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.854963 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t62l5"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.857625 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t62l5"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.858790 4618 scope.go:117] "RemoveContainer" containerID="6f9ae7fe1e02f33a55bf147ede316b8ccb4b0d18ad00ec80a56931a521a67e7c" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.868848 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4mpc9" podStartSLOduration=1.868826371 podStartE2EDuration="1.868826371s" podCreationTimestamp="2026-01-21 09:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:08:01.868299612 +0000 UTC m=+280.618766929" watchObservedRunningTime="2026-01-21 09:08:01.868826371 +0000 UTC m=+280.619293687" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.879108 4618 scope.go:117] "RemoveContainer" containerID="2731be176de8b206a5f4c69d59b92d3eda352165b8913271a4da42cbd20b3e42" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.885727 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cwzlc"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.891074 4618 scope.go:117] "RemoveContainer" containerID="7c983e09b1696df8f740c8fad1dfe8f0c36e2333eef937ace1997bfc128d2470" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.894257 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-cwzlc"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.904555 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qr6lr"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.910525 4618 scope.go:117] "RemoveContainer" containerID="7100f5900f56f994582a364d2fa1ec70980642ea217cdfcbf00f26ff4c6fa65f" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.911560 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qr6lr"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.915484 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-55flm"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.917641 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-55flm"] Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.920709 4618 scope.go:117] "RemoveContainer" containerID="1921801e602f242bf4d79e42f06e2355a29dcd078924473c6c9bf01064f8702a" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.930367 4618 scope.go:117] "RemoveContainer" containerID="0d5f8b657611de154f50378be5e8f2d4d336eb8019911cacab7c438655120611" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.939683 4618 scope.go:117] "RemoveContainer" containerID="453a79e35c186154f3fb060a24b1d797a2c494f654a8abb255ddc31ea17934b7" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.947607 4618 scope.go:117] "RemoveContainer" containerID="5141c89656827dc6862cc75abb093172b412ffcb6e17a94eb048bb487f83b1fc" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.957253 4618 scope.go:117] "RemoveContainer" containerID="f875f165f57a949746f8fc769af7177f6da0358d1122111684e937c07f050bb2" Jan 21 09:08:01 crc kubenswrapper[4618]: I0121 09:08:01.966753 4618 scope.go:117] "RemoveContainer" 
containerID="452a6e9e37b1b5ed72e29a0a789a22568e187987e0fe300d313f915d3551551a" Jan 21 09:08:03 crc kubenswrapper[4618]: I0121 09:08:03.542100 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" path="/var/lib/kubelet/pods/18ec235a-04ac-489e-92cd-e1e69c8a1074/volumes" Jan 21 09:08:03 crc kubenswrapper[4618]: I0121 09:08:03.542952 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" path="/var/lib/kubelet/pods/69e6a09f-0983-4b1b-83a7-13e8acd56f61/volumes" Jan 21 09:08:03 crc kubenswrapper[4618]: I0121 09:08:03.543502 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" path="/var/lib/kubelet/pods/7867ab6f-4cdb-492d-9106-b1f42a66b62e/volumes" Jan 21 09:08:03 crc kubenswrapper[4618]: I0121 09:08:03.543932 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" path="/var/lib/kubelet/pods/c1c156bc-1694-457b-b26e-c46d6b5be62d/volumes" Jan 21 09:08:03 crc kubenswrapper[4618]: I0121 09:08:03.544435 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" path="/var/lib/kubelet/pods/c4468290-050b-4a6a-9388-cbbae3c71d68/volumes" Jan 21 09:08:21 crc kubenswrapper[4618]: I0121 09:08:21.441449 4618 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.187550 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9svvz"] Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.188674 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.188751 4618 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.188807 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.188880 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.188925 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.188971 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189014 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189055 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189099 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189137 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189210 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189254 4618 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189299 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189345 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189387 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189429 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189474 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189516 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="extract-utilities" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189558 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189599 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189641 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189683 4618 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="extract-content" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189732 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" containerName="marketplace-operator" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189776 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" containerName="marketplace-operator" Jan 21 09:09:25 crc kubenswrapper[4618]: E0121 09:09:25.189829 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189873 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.189998 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e6a09f-0983-4b1b-83a7-13e8acd56f61" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.190043 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7867ab6f-4cdb-492d-9106-b1f42a66b62e" containerName="marketplace-operator" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.190088 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ec235a-04ac-489e-92cd-e1e69c8a1074" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.190134 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4468290-050b-4a6a-9388-cbbae3c71d68" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.190199 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c156bc-1694-457b-b26e-c46d6b5be62d" containerName="registry-server" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.190826 4618 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.193522 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.198308 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9svvz"] Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.219225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfht\" (UniqueName: \"kubernetes.io/projected/4e29e499-2283-4105-bcf5-73ae74791ce6-kube-api-access-vcfht\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.219271 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e29e499-2283-4105-bcf5-73ae74791ce6-catalog-content\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.219388 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e29e499-2283-4105-bcf5-73ae74791ce6-utilities\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.319992 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfht\" (UniqueName: \"kubernetes.io/projected/4e29e499-2283-4105-bcf5-73ae74791ce6-kube-api-access-vcfht\") pod 
\"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.320045 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e29e499-2283-4105-bcf5-73ae74791ce6-catalog-content\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.320102 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e29e499-2283-4105-bcf5-73ae74791ce6-utilities\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.320495 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e29e499-2283-4105-bcf5-73ae74791ce6-utilities\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.320571 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e29e499-2283-4105-bcf5-73ae74791ce6-catalog-content\") pod \"community-operators-9svvz\" (UID: \"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.334132 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfht\" (UniqueName: \"kubernetes.io/projected/4e29e499-2283-4105-bcf5-73ae74791ce6-kube-api-access-vcfht\") pod \"community-operators-9svvz\" (UID: 
\"4e29e499-2283-4105-bcf5-73ae74791ce6\") " pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.385366 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-69lfl"] Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.386861 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.389794 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.390916 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69lfl"] Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.506604 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.522438 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-catalog-content\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.522636 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-utilities\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.522664 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d2x6p\" (UniqueName: \"kubernetes.io/projected/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-kube-api-access-d2x6p\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.623540 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-catalog-content\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.623580 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-utilities\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.624042 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2x6p\" (UniqueName: \"kubernetes.io/projected/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-kube-api-access-d2x6p\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.624562 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-utilities\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.625604 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-catalog-content\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.638049 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2x6p\" (UniqueName: \"kubernetes.io/projected/4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5-kube-api-access-d2x6p\") pod \"redhat-marketplace-69lfl\" (UID: \"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5\") " pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.697896 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:25 crc kubenswrapper[4618]: I0121 09:09:25.833878 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9svvz"] Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.019007 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69lfl"] Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.137286 4618 generic.go:334] "Generic (PLEG): container finished" podID="4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5" containerID="aefd3dc427ccafb05b4dcffff8c8f9d4e70ff563bacbf2b2be8555f73c8fc625" exitCode=0 Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.137382 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69lfl" event={"ID":"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5","Type":"ContainerDied","Data":"aefd3dc427ccafb05b4dcffff8c8f9d4e70ff563bacbf2b2be8555f73c8fc625"} Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.137421 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69lfl" 
event={"ID":"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5","Type":"ContainerStarted","Data":"77a5d1b665856b7ee0a6a57f525b0504ebafd1d69d96eea5c44c2f78037592c5"} Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.140218 4618 generic.go:334] "Generic (PLEG): container finished" podID="4e29e499-2283-4105-bcf5-73ae74791ce6" containerID="f3d1e801321ab56300ed5016a491cf12051119f0744354f07c42460e900ae9a9" exitCode=0 Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.140241 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svvz" event={"ID":"4e29e499-2283-4105-bcf5-73ae74791ce6","Type":"ContainerDied","Data":"f3d1e801321ab56300ed5016a491cf12051119f0744354f07c42460e900ae9a9"} Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.140264 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svvz" event={"ID":"4e29e499-2283-4105-bcf5-73ae74791ce6","Type":"ContainerStarted","Data":"c3c447813ccbf33fc7bb4690c35b9483dc5671fbccc30be24981d61028db5686"} Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.959118 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:09:26 crc kubenswrapper[4618]: I0121 09:09:26.959193 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.145192 4618 generic.go:334] "Generic (PLEG): container finished" podID="4e29e499-2283-4105-bcf5-73ae74791ce6" 
containerID="c0540d53c931512a91f5e27233684796c5af143da6431ac6d487d6438521b50b" exitCode=0 Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.145258 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svvz" event={"ID":"4e29e499-2283-4105-bcf5-73ae74791ce6","Type":"ContainerDied","Data":"c0540d53c931512a91f5e27233684796c5af143da6431ac6d487d6438521b50b"} Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.586155 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xdxtq"] Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.586999 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.589694 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.595083 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdxtq"] Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.744797 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc646d50-9435-404e-9b80-42ad016be4f9-utilities\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.744867 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc646d50-9435-404e-9b80-42ad016be4f9-catalog-content\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.744933 
4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcm7\" (UniqueName: \"kubernetes.io/projected/bc646d50-9435-404e-9b80-42ad016be4f9-kube-api-access-gwcm7\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.784609 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xcsv8"] Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.785450 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.786999 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.793477 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcsv8"] Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.846598 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc646d50-9435-404e-9b80-42ad016be4f9-utilities\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.846654 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc646d50-9435-404e-9b80-42ad016be4f9-catalog-content\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.846784 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gwcm7\" (UniqueName: \"kubernetes.io/projected/bc646d50-9435-404e-9b80-42ad016be4f9-kube-api-access-gwcm7\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.847077 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc646d50-9435-404e-9b80-42ad016be4f9-utilities\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.847157 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc646d50-9435-404e-9b80-42ad016be4f9-catalog-content\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.862917 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcm7\" (UniqueName: \"kubernetes.io/projected/bc646d50-9435-404e-9b80-42ad016be4f9-kube-api-access-gwcm7\") pod \"redhat-operators-xdxtq\" (UID: \"bc646d50-9435-404e-9b80-42ad016be4f9\") " pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.897844 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.948247 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm998\" (UniqueName: \"kubernetes.io/projected/38d4879c-3ab9-4282-9d58-263cfb585759-kube-api-access-sm998\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.948548 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d4879c-3ab9-4282-9d58-263cfb585759-utilities\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:27 crc kubenswrapper[4618]: I0121 09:09:27.948759 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d4879c-3ab9-4282-9d58-263cfb585759-catalog-content\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.011111 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-klh8s"] Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.011702 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.019342 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-klh8s"] Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.049457 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d4879c-3ab9-4282-9d58-263cfb585759-catalog-content\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.049535 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm998\" (UniqueName: \"kubernetes.io/projected/38d4879c-3ab9-4282-9d58-263cfb585759-kube-api-access-sm998\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.049556 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d4879c-3ab9-4282-9d58-263cfb585759-utilities\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.050151 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d4879c-3ab9-4282-9d58-263cfb585759-utilities\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.050353 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/38d4879c-3ab9-4282-9d58-263cfb585759-catalog-content\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.071800 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm998\" (UniqueName: \"kubernetes.io/projected/38d4879c-3ab9-4282-9d58-263cfb585759-kube-api-access-sm998\") pod \"certified-operators-xcsv8\" (UID: \"38d4879c-3ab9-4282-9d58-263cfb585759\") " pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.095506 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdxtq"] Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.096523 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:28 crc kubenswrapper[4618]: W0121 09:09:28.101064 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc646d50_9435_404e_9b80_42ad016be4f9.slice/crio-b6ac80a024f309a8ba5e2393ea4f77392b167d630cb2dd0d8a23b4b55f767165 WatchSource:0}: Error finding container b6ac80a024f309a8ba5e2393ea4f77392b167d630cb2dd0d8a23b4b55f767165: Status 404 returned error can't find the container with id b6ac80a024f309a8ba5e2393ea4f77392b167d630cb2dd0d8a23b4b55f767165 Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.150903 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 
09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.150949 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cff6d10-1ca9-4f91-a3df-859a315bc505-trusted-ca\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.150970 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cff6d10-1ca9-4f91-a3df-859a315bc505-ca-trust-extracted\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.151014 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cff6d10-1ca9-4f91-a3df-859a315bc505-registry-certificates\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.151047 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-registry-tls\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.151074 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-bound-sa-token\") pod 
\"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.151098 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtcr4\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-kube-api-access-vtcr4\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.151157 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cff6d10-1ca9-4f91-a3df-859a315bc505-installation-pull-secrets\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.165925 4618 generic.go:334] "Generic (PLEG): container finished" podID="4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5" containerID="44ca6eef2b558f577401b387a66fa240b33bdaa45ce08cd30621a9cabe9145a2" exitCode=0 Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.166036 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69lfl" event={"ID":"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5","Type":"ContainerDied","Data":"44ca6eef2b558f577401b387a66fa240b33bdaa45ce08cd30621a9cabe9145a2"} Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.168795 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxtq" event={"ID":"bc646d50-9435-404e-9b80-42ad016be4f9","Type":"ContainerStarted","Data":"b6ac80a024f309a8ba5e2393ea4f77392b167d630cb2dd0d8a23b4b55f767165"} Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.173771 
4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9svvz" event={"ID":"4e29e499-2283-4105-bcf5-73ae74791ce6","Type":"ContainerStarted","Data":"23be4de418e35c2ab1f3909d8a910662b061d8b86ae6dca0081007288049825a"} Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.177099 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.192817 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9svvz" podStartSLOduration=1.713814867 podStartE2EDuration="3.192796937s" podCreationTimestamp="2026-01-21 09:09:25 +0000 UTC" firstStartedPulling="2026-01-21 09:09:26.14165467 +0000 UTC m=+364.892121988" lastFinishedPulling="2026-01-21 09:09:27.62063674 +0000 UTC m=+366.371104058" observedRunningTime="2026-01-21 09:09:28.191231501 +0000 UTC m=+366.941698818" watchObservedRunningTime="2026-01-21 09:09:28.192796937 +0000 UTC m=+366.943264254" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.246353 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcsv8"] Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252368 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cff6d10-1ca9-4f91-a3df-859a315bc505-ca-trust-extracted\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252428 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cff6d10-1ca9-4f91-a3df-859a315bc505-registry-certificates\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252462 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-registry-tls\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252484 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-bound-sa-token\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252504 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtcr4\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-kube-api-access-vtcr4\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252536 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cff6d10-1ca9-4f91-a3df-859a315bc505-installation-pull-secrets\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.252894 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cff6d10-1ca9-4f91-a3df-859a315bc505-trusted-ca\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.253196 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cff6d10-1ca9-4f91-a3df-859a315bc505-ca-trust-extracted\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.254778 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cff6d10-1ca9-4f91-a3df-859a315bc505-trusted-ca\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.255005 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cff6d10-1ca9-4f91-a3df-859a315bc505-registry-certificates\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.256452 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-registry-tls\") pod \"image-registry-66df7c8f76-klh8s\" (UID: 
\"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.256492 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cff6d10-1ca9-4f91-a3df-859a315bc505-installation-pull-secrets\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.265618 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtcr4\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-kube-api-access-vtcr4\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.266795 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cff6d10-1ca9-4f91-a3df-859a315bc505-bound-sa-token\") pod \"image-registry-66df7c8f76-klh8s\" (UID: \"4cff6d10-1ca9-4f91-a3df-859a315bc505\") " pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: W0121 09:09:28.275805 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d4879c_3ab9_4282_9d58_263cfb585759.slice/crio-1db3acabeb08fe606a5aa85ca9eeaaa69fb3f0259f1099b1e7853c1b421d63b9 WatchSource:0}: Error finding container 1db3acabeb08fe606a5aa85ca9eeaaa69fb3f0259f1099b1e7853c1b421d63b9: Status 404 returned error can't find the container with id 1db3acabeb08fe606a5aa85ca9eeaaa69fb3f0259f1099b1e7853c1b421d63b9 Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.328108 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:28 crc kubenswrapper[4618]: I0121 09:09:28.679600 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-klh8s"] Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.181043 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69lfl" event={"ID":"4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5","Type":"ContainerStarted","Data":"a31ebb82707fd397779def17d01491cda16d7a91338f52860f6fea0d2f39b805"} Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.183638 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" event={"ID":"4cff6d10-1ca9-4f91-a3df-859a315bc505","Type":"ContainerStarted","Data":"0512bbbe41a8e68a88c41fb4ce57383c214ffd3f5f0688615ab18e08bf1186a5"} Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.183675 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" event={"ID":"4cff6d10-1ca9-4f91-a3df-859a315bc505","Type":"ContainerStarted","Data":"41d062f43aac3914972d81a83f1dac118d9c6ad19cd451dc62785205e4fa7ccc"} Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.183765 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.185201 4618 generic.go:334] "Generic (PLEG): container finished" podID="bc646d50-9435-404e-9b80-42ad016be4f9" containerID="2fe1fb32dec924b51e824a56d445dc416f7c7a594e4c03d26f5ee92018430432" exitCode=0 Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.185280 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxtq" 
event={"ID":"bc646d50-9435-404e-9b80-42ad016be4f9","Type":"ContainerDied","Data":"2fe1fb32dec924b51e824a56d445dc416f7c7a594e4c03d26f5ee92018430432"} Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.186667 4618 generic.go:334] "Generic (PLEG): container finished" podID="38d4879c-3ab9-4282-9d58-263cfb585759" containerID="092c85d7c21313c166ccef8926fb9c2cbcc1f2e79371f129a1ecb43ec6521207" exitCode=0 Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.186733 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcsv8" event={"ID":"38d4879c-3ab9-4282-9d58-263cfb585759","Type":"ContainerDied","Data":"092c85d7c21313c166ccef8926fb9c2cbcc1f2e79371f129a1ecb43ec6521207"} Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.186764 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcsv8" event={"ID":"38d4879c-3ab9-4282-9d58-263cfb585759","Type":"ContainerStarted","Data":"1db3acabeb08fe606a5aa85ca9eeaaa69fb3f0259f1099b1e7853c1b421d63b9"} Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.201101 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-69lfl" podStartSLOduration=1.420568072 podStartE2EDuration="4.201086696s" podCreationTimestamp="2026-01-21 09:09:25 +0000 UTC" firstStartedPulling="2026-01-21 09:09:26.138280613 +0000 UTC m=+364.888747931" lastFinishedPulling="2026-01-21 09:09:28.918799238 +0000 UTC m=+367.669266555" observedRunningTime="2026-01-21 09:09:29.199051118 +0000 UTC m=+367.949518434" watchObservedRunningTime="2026-01-21 09:09:29.201086696 +0000 UTC m=+367.951554014" Jan 21 09:09:29 crc kubenswrapper[4618]: I0121 09:09:29.241683 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" podStartSLOduration=2.241660538 podStartE2EDuration="2.241660538s" podCreationTimestamp="2026-01-21 09:09:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:09:29.239849913 +0000 UTC m=+367.990317230" watchObservedRunningTime="2026-01-21 09:09:29.241660538 +0000 UTC m=+367.992127855" Jan 21 09:09:30 crc kubenswrapper[4618]: I0121 09:09:30.192956 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxtq" event={"ID":"bc646d50-9435-404e-9b80-42ad016be4f9","Type":"ContainerStarted","Data":"96a9c993eb072907f24428b8a3a6fe5076344b212d4d2960a64c1a28714f398b"} Jan 21 09:09:31 crc kubenswrapper[4618]: I0121 09:09:31.199530 4618 generic.go:334] "Generic (PLEG): container finished" podID="bc646d50-9435-404e-9b80-42ad016be4f9" containerID="96a9c993eb072907f24428b8a3a6fe5076344b212d4d2960a64c1a28714f398b" exitCode=0 Jan 21 09:09:31 crc kubenswrapper[4618]: I0121 09:09:31.199643 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxtq" event={"ID":"bc646d50-9435-404e-9b80-42ad016be4f9","Type":"ContainerDied","Data":"96a9c993eb072907f24428b8a3a6fe5076344b212d4d2960a64c1a28714f398b"} Jan 21 09:09:31 crc kubenswrapper[4618]: I0121 09:09:31.202369 4618 generic.go:334] "Generic (PLEG): container finished" podID="38d4879c-3ab9-4282-9d58-263cfb585759" containerID="d8cf29eb8ac66bc329584e8c8fb9d8c2bb8487d7bf0c1969a866157a5c87a7c0" exitCode=0 Jan 21 09:09:31 crc kubenswrapper[4618]: I0121 09:09:31.202413 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcsv8" event={"ID":"38d4879c-3ab9-4282-9d58-263cfb585759","Type":"ContainerDied","Data":"d8cf29eb8ac66bc329584e8c8fb9d8c2bb8487d7bf0c1969a866157a5c87a7c0"} Jan 21 09:09:32 crc kubenswrapper[4618]: I0121 09:09:32.209844 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcsv8" 
event={"ID":"38d4879c-3ab9-4282-9d58-263cfb585759","Type":"ContainerStarted","Data":"82b4c720aeaca2e2b4cf3962f57f414e68415fdda64b3b1a628c1a605030bbfc"} Jan 21 09:09:32 crc kubenswrapper[4618]: I0121 09:09:32.212006 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxtq" event={"ID":"bc646d50-9435-404e-9b80-42ad016be4f9","Type":"ContainerStarted","Data":"10a3cb553e61e28238eb42c865d58cae7a354a5f72292b0d1c4def26c462cef8"} Jan 21 09:09:32 crc kubenswrapper[4618]: I0121 09:09:32.244395 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xdxtq" podStartSLOduration=2.776954647 podStartE2EDuration="5.244370468s" podCreationTimestamp="2026-01-21 09:09:27 +0000 UTC" firstStartedPulling="2026-01-21 09:09:29.186302377 +0000 UTC m=+367.936769693" lastFinishedPulling="2026-01-21 09:09:31.653718207 +0000 UTC m=+370.404185514" observedRunningTime="2026-01-21 09:09:32.240651983 +0000 UTC m=+370.991119300" watchObservedRunningTime="2026-01-21 09:09:32.244370468 +0000 UTC m=+370.994837785" Jan 21 09:09:32 crc kubenswrapper[4618]: I0121 09:09:32.244818 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xcsv8" podStartSLOduration=2.641337123 podStartE2EDuration="5.244806988s" podCreationTimestamp="2026-01-21 09:09:27 +0000 UTC" firstStartedPulling="2026-01-21 09:09:29.188562789 +0000 UTC m=+367.939030106" lastFinishedPulling="2026-01-21 09:09:31.792032654 +0000 UTC m=+370.542499971" observedRunningTime="2026-01-21 09:09:32.224169628 +0000 UTC m=+370.974636945" watchObservedRunningTime="2026-01-21 09:09:32.244806988 +0000 UTC m=+370.995274305" Jan 21 09:09:35 crc kubenswrapper[4618]: I0121 09:09:35.506972 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:35 crc kubenswrapper[4618]: I0121 09:09:35.507283 4618 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:35 crc kubenswrapper[4618]: I0121 09:09:35.542444 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:35 crc kubenswrapper[4618]: I0121 09:09:35.698385 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:35 crc kubenswrapper[4618]: I0121 09:09:35.698430 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:35 crc kubenswrapper[4618]: I0121 09:09:35.724283 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:36 crc kubenswrapper[4618]: I0121 09:09:36.253204 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9svvz" Jan 21 09:09:36 crc kubenswrapper[4618]: I0121 09:09:36.253258 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-69lfl" Jan 21 09:09:37 crc kubenswrapper[4618]: I0121 09:09:37.898040 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:37 crc kubenswrapper[4618]: I0121 09:09:37.898088 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:37 crc kubenswrapper[4618]: I0121 09:09:37.924687 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:38 crc kubenswrapper[4618]: I0121 09:09:38.097697 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:38 crc kubenswrapper[4618]: I0121 09:09:38.097731 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:38 crc kubenswrapper[4618]: I0121 09:09:38.124563 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:38 crc kubenswrapper[4618]: I0121 09:09:38.261063 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xcsv8" Jan 21 09:09:38 crc kubenswrapper[4618]: I0121 09:09:38.265467 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xdxtq" Jan 21 09:09:48 crc kubenswrapper[4618]: I0121 09:09:48.332430 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-klh8s" Jan 21 09:09:48 crc kubenswrapper[4618]: I0121 09:09:48.367317 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bkf4x"] Jan 21 09:09:56 crc kubenswrapper[4618]: I0121 09:09:56.958802 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:09:56 crc kubenswrapper[4618]: I0121 09:09:56.958921 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 
09:10:13.390194 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" podUID="ad054293-342a-4919-b938-6032654fbc53" containerName="registry" containerID="cri-o://1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356" gracePeriod=30 Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.647765 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845216 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad054293-342a-4919-b938-6032654fbc53-installation-pull-secrets\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845377 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845409 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-trusted-ca\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845427 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-registry-tls\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc 
kubenswrapper[4618]: I0121 09:10:13.845450 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq8b9\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-kube-api-access-xq8b9\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845500 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-bound-sa-token\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845518 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-registry-certificates\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.845537 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad054293-342a-4919-b938-6032654fbc53-ca-trust-extracted\") pod \"ad054293-342a-4919-b938-6032654fbc53\" (UID: \"ad054293-342a-4919-b938-6032654fbc53\") " Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.846083 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.846452 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.850603 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad054293-342a-4919-b938-6032654fbc53-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.850701 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.850937 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.851499 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-kube-api-access-xq8b9" (OuterVolumeSpecName: "kube-api-access-xq8b9") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "kube-api-access-xq8b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.852083 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.858204 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad054293-342a-4919-b938-6032654fbc53-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ad054293-342a-4919-b938-6032654fbc53" (UID: "ad054293-342a-4919-b938-6032654fbc53"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946794 4618 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad054293-342a-4919-b938-6032654fbc53-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946818 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946827 4618 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946835 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq8b9\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-kube-api-access-xq8b9\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946844 4618 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad054293-342a-4919-b938-6032654fbc53-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946851 4618 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad054293-342a-4919-b938-6032654fbc53-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:13 crc kubenswrapper[4618]: I0121 09:10:13.946858 4618 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad054293-342a-4919-b938-6032654fbc53-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 09:10:14 crc 
kubenswrapper[4618]: I0121 09:10:14.377697 4618 generic.go:334] "Generic (PLEG): container finished" podID="ad054293-342a-4919-b938-6032654fbc53" containerID="1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356" exitCode=0 Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.377734 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" event={"ID":"ad054293-342a-4919-b938-6032654fbc53","Type":"ContainerDied","Data":"1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356"} Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.377757 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" event={"ID":"ad054293-342a-4919-b938-6032654fbc53","Type":"ContainerDied","Data":"ee354de2302baa33d6a95142b735179115555b5aaf7a0faeca8bc072c4601a6d"} Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.377773 4618 scope.go:117] "RemoveContainer" containerID="1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356" Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.377794 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bkf4x" Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.389254 4618 scope.go:117] "RemoveContainer" containerID="1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356" Jan 21 09:10:14 crc kubenswrapper[4618]: E0121 09:10:14.389480 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356\": container with ID starting with 1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356 not found: ID does not exist" containerID="1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356" Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.389508 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356"} err="failed to get container status \"1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356\": rpc error: code = NotFound desc = could not find container \"1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356\": container with ID starting with 1000cbda2772e71db3ec522c0aa6ce10c6d2af6fa54c2b2f0638f0377a981356 not found: ID does not exist" Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.396820 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bkf4x"] Jan 21 09:10:14 crc kubenswrapper[4618]: I0121 09:10:14.399075 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bkf4x"] Jan 21 09:10:15 crc kubenswrapper[4618]: I0121 09:10:15.541337 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad054293-342a-4919-b938-6032654fbc53" path="/var/lib/kubelet/pods/ad054293-342a-4919-b938-6032654fbc53/volumes" Jan 21 09:10:26 crc kubenswrapper[4618]: I0121 
09:10:26.959074 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:10:26 crc kubenswrapper[4618]: I0121 09:10:26.959510 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:10:26 crc kubenswrapper[4618]: I0121 09:10:26.959556 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:10:26 crc kubenswrapper[4618]: I0121 09:10:26.960092 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db8abbeb18512486c7d5cdd62c22db63be4c62bacb9e602c0b05fd7df41ce206"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:10:26 crc kubenswrapper[4618]: I0121 09:10:26.960161 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://db8abbeb18512486c7d5cdd62c22db63be4c62bacb9e602c0b05fd7df41ce206" gracePeriod=600 Jan 21 09:10:27 crc kubenswrapper[4618]: I0121 09:10:27.425026 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="db8abbeb18512486c7d5cdd62c22db63be4c62bacb9e602c0b05fd7df41ce206" exitCode=0 Jan 21 
09:10:27 crc kubenswrapper[4618]: I0121 09:10:27.425102 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"db8abbeb18512486c7d5cdd62c22db63be4c62bacb9e602c0b05fd7df41ce206"} Jan 21 09:10:27 crc kubenswrapper[4618]: I0121 09:10:27.425272 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"54899a279b241edcd830c067b62c5fb70626feb80084fc4b9f8209133774eb23"} Jan 21 09:10:27 crc kubenswrapper[4618]: I0121 09:10:27.425295 4618 scope.go:117] "RemoveContainer" containerID="cf3c0bd9f7d382de7870358c21af9a3d1e06ec2bd574d7d1f893f55c4f060d9b" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.157920 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm"] Jan 21 09:12:21 crc kubenswrapper[4618]: E0121 09:12:21.159436 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad054293-342a-4919-b938-6032654fbc53" containerName="registry" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.159455 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad054293-342a-4919-b938-6032654fbc53" containerName="registry" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.159548 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad054293-342a-4919-b938-6032654fbc53" containerName="registry" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.167912 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.171292 4618 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-q6gp8" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.171504 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.171655 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.176191 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm"] Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.180318 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-vv9rr"] Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.180923 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-vv9rr" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.182781 4618 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9hmmr" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.190950 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-vv9rr"] Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.200835 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-q6frw"] Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.201896 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.204410 4618 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-22ss2" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.216173 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-q6frw"] Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.291026 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mtvg\" (UniqueName: \"kubernetes.io/projected/a23d36e0-6e5d-4cc6-a21c-9d6a114e7158-kube-api-access-9mtvg\") pod \"cert-manager-cainjector-cf98fcc89-j6lvm\" (UID: \"a23d36e0-6e5d-4cc6-a21c-9d6a114e7158\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.291123 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7tb\" (UniqueName: \"kubernetes.io/projected/d736c899-0a94-4fb8-9e97-077345f1a8b7-kube-api-access-hs7tb\") pod \"cert-manager-webhook-687f57d79b-q6frw\" (UID: \"d736c899-0a94-4fb8-9e97-077345f1a8b7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.291169 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfrw\" (UniqueName: \"kubernetes.io/projected/d9674a2f-8cdc-4165-b8e0-9cfc0914d17f-kube-api-access-fcfrw\") pod \"cert-manager-858654f9db-vv9rr\" (UID: \"d9674a2f-8cdc-4165-b8e0-9cfc0914d17f\") " pod="cert-manager/cert-manager-858654f9db-vv9rr" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.392470 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7tb\" (UniqueName: 
\"kubernetes.io/projected/d736c899-0a94-4fb8-9e97-077345f1a8b7-kube-api-access-hs7tb\") pod \"cert-manager-webhook-687f57d79b-q6frw\" (UID: \"d736c899-0a94-4fb8-9e97-077345f1a8b7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.392519 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfrw\" (UniqueName: \"kubernetes.io/projected/d9674a2f-8cdc-4165-b8e0-9cfc0914d17f-kube-api-access-fcfrw\") pod \"cert-manager-858654f9db-vv9rr\" (UID: \"d9674a2f-8cdc-4165-b8e0-9cfc0914d17f\") " pod="cert-manager/cert-manager-858654f9db-vv9rr" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.392602 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mtvg\" (UniqueName: \"kubernetes.io/projected/a23d36e0-6e5d-4cc6-a21c-9d6a114e7158-kube-api-access-9mtvg\") pod \"cert-manager-cainjector-cf98fcc89-j6lvm\" (UID: \"a23d36e0-6e5d-4cc6-a21c-9d6a114e7158\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.407925 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7tb\" (UniqueName: \"kubernetes.io/projected/d736c899-0a94-4fb8-9e97-077345f1a8b7-kube-api-access-hs7tb\") pod \"cert-manager-webhook-687f57d79b-q6frw\" (UID: \"d736c899-0a94-4fb8-9e97-077345f1a8b7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.407941 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcfrw\" (UniqueName: \"kubernetes.io/projected/d9674a2f-8cdc-4165-b8e0-9cfc0914d17f-kube-api-access-fcfrw\") pod \"cert-manager-858654f9db-vv9rr\" (UID: \"d9674a2f-8cdc-4165-b8e0-9cfc0914d17f\") " pod="cert-manager/cert-manager-858654f9db-vv9rr" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.408513 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9mtvg\" (UniqueName: \"kubernetes.io/projected/a23d36e0-6e5d-4cc6-a21c-9d6a114e7158-kube-api-access-9mtvg\") pod \"cert-manager-cainjector-cf98fcc89-j6lvm\" (UID: \"a23d36e0-6e5d-4cc6-a21c-9d6a114e7158\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.488746 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.495346 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-vv9rr" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.514576 4618 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-22ss2" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.522663 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.639869 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm"] Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.649356 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.670571 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-vv9rr"] Jan 21 09:12:21 crc kubenswrapper[4618]: W0121 09:12:21.674283 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9674a2f_8cdc_4165_b8e0_9cfc0914d17f.slice/crio-5f2febcc4dd9484d596f3bfb5e602beae57cfe8330cc8e749a75e48cc0e30dd1 WatchSource:0}: Error finding container 5f2febcc4dd9484d596f3bfb5e602beae57cfe8330cc8e749a75e48cc0e30dd1: Status 404 returned error can't find the container with id 5f2febcc4dd9484d596f3bfb5e602beae57cfe8330cc8e749a75e48cc0e30dd1 Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.829705 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-vv9rr" event={"ID":"d9674a2f-8cdc-4165-b8e0-9cfc0914d17f","Type":"ContainerStarted","Data":"5f2febcc4dd9484d596f3bfb5e602beae57cfe8330cc8e749a75e48cc0e30dd1"} Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.830674 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" event={"ID":"a23d36e0-6e5d-4cc6-a21c-9d6a114e7158","Type":"ContainerStarted","Data":"ce1e55ab0b264800ac8337ed5bde12e3abf12460c0c3799272318bee7b8129f3"} Jan 21 09:12:21 crc kubenswrapper[4618]: I0121 09:12:21.902917 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-q6frw"] Jan 21 09:12:21 crc kubenswrapper[4618]: 
W0121 09:12:21.905328 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd736c899_0a94_4fb8_9e97_077345f1a8b7.slice/crio-fb9c29ec749a8bce14df1be2a4d49f020f06dd2d1ec4370eab37d6406b8d2acd WatchSource:0}: Error finding container fb9c29ec749a8bce14df1be2a4d49f020f06dd2d1ec4370eab37d6406b8d2acd: Status 404 returned error can't find the container with id fb9c29ec749a8bce14df1be2a4d49f020f06dd2d1ec4370eab37d6406b8d2acd Jan 21 09:12:22 crc kubenswrapper[4618]: I0121 09:12:22.836218 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" event={"ID":"d736c899-0a94-4fb8-9e97-077345f1a8b7","Type":"ContainerStarted","Data":"fb9c29ec749a8bce14df1be2a4d49f020f06dd2d1ec4370eab37d6406b8d2acd"} Jan 21 09:12:23 crc kubenswrapper[4618]: I0121 09:12:23.851512 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" event={"ID":"a23d36e0-6e5d-4cc6-a21c-9d6a114e7158","Type":"ContainerStarted","Data":"11b80a4f23ab63c7a616377c2cb24e488cec4b5eceb4a5f856732131f2d2c556"} Jan 21 09:12:23 crc kubenswrapper[4618]: I0121 09:12:23.864076 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j6lvm" podStartSLOduration=1.249996297 podStartE2EDuration="2.864046217s" podCreationTimestamp="2026-01-21 09:12:21 +0000 UTC" firstStartedPulling="2026-01-21 09:12:21.648884616 +0000 UTC m=+540.399351952" lastFinishedPulling="2026-01-21 09:12:23.262934556 +0000 UTC m=+542.013401872" observedRunningTime="2026-01-21 09:12:23.86331345 +0000 UTC m=+542.613780768" watchObservedRunningTime="2026-01-21 09:12:23.864046217 +0000 UTC m=+542.614513534" Jan 21 09:12:24 crc kubenswrapper[4618]: I0121 09:12:24.857634 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" 
event={"ID":"d736c899-0a94-4fb8-9e97-077345f1a8b7","Type":"ContainerStarted","Data":"2115864cf164c8fe583ddda0b584b65bb9c962d621439f701fed4881bf13a74e"} Jan 21 09:12:24 crc kubenswrapper[4618]: I0121 09:12:24.859726 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-vv9rr" event={"ID":"d9674a2f-8cdc-4165-b8e0-9cfc0914d17f","Type":"ContainerStarted","Data":"f5f04724af0bc8c9a4435a62cdeef4e391b5874562170e58764409231c26fffd"} Jan 21 09:12:24 crc kubenswrapper[4618]: I0121 09:12:24.869258 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" podStartSLOduration=1.423587533 podStartE2EDuration="3.869240996s" podCreationTimestamp="2026-01-21 09:12:21 +0000 UTC" firstStartedPulling="2026-01-21 09:12:21.906817247 +0000 UTC m=+540.657284565" lastFinishedPulling="2026-01-21 09:12:24.352470711 +0000 UTC m=+543.102938028" observedRunningTime="2026-01-21 09:12:24.867253993 +0000 UTC m=+543.617721300" watchObservedRunningTime="2026-01-21 09:12:24.869240996 +0000 UTC m=+543.619708313" Jan 21 09:12:24 crc kubenswrapper[4618]: I0121 09:12:24.876526 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-vv9rr" podStartSLOduration=1.204756331 podStartE2EDuration="3.876506719s" podCreationTimestamp="2026-01-21 09:12:21 +0000 UTC" firstStartedPulling="2026-01-21 09:12:21.6759337 +0000 UTC m=+540.426401017" lastFinishedPulling="2026-01-21 09:12:24.347684087 +0000 UTC m=+543.098151405" observedRunningTime="2026-01-21 09:12:24.875466393 +0000 UTC m=+543.625933700" watchObservedRunningTime="2026-01-21 09:12:24.876506719 +0000 UTC m=+543.626974036" Jan 21 09:12:25 crc kubenswrapper[4618]: I0121 09:12:25.863874 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:31 crc kubenswrapper[4618]: I0121 09:12:31.526588 4618 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-q6frw" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069126 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-894tg"] Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069452 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-controller" containerID="cri-o://95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069520 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="nbdb" containerID="cri-o://641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069555 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="sbdb" containerID="cri-o://8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069596 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069644 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" 
containerName="northd" containerID="cri-o://ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069656 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-acl-logging" containerID="cri-o://c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.069711 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-node" containerID="cri-o://a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.094947 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" containerID="cri-o://4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" gracePeriod=30 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.306510 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/3.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.308544 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovn-acl-logging/0.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.308904 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovn-controller/0.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.309258 4618 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350045 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lnh42"] Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350250 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350268 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350276 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="nbdb" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350283 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="nbdb" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350292 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350297 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350303 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="northd" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350308 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="northd" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350317 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kubecfg-setup" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350322 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kubecfg-setup" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350327 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-node" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350332 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-node" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350340 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350347 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350356 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="sbdb" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350361 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="sbdb" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350370 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350376 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350384 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350390 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350397 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350402 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350408 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350413 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.350421 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-acl-logging" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350426 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-acl-logging" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350498 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350506 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="nbdb" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350513 4618 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350520 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="sbdb" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350529 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350534 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350540 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350548 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-acl-logging" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350555 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="northd" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350562 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovn-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350568 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="kube-rbac-proxy-node" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.350712 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" containerName="ovnkube-controller" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.351901 4618 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411604 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-ovn\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411653 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-kubelet\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411697 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-netns\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411741 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/992361e5-8eb9-426d-9eed-afffb0c30615-ovn-node-metrics-cert\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411719 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411764 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-script-lib\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411791 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-config\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411788 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411779 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411808 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-log-socket\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411847 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-log-socket" (OuterVolumeSpecName: "log-socket") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411890 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-openvswitch\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411926 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-env-overrides\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411945 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-systemd-units\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411976 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-systemd\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412000 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-etc-openvswitch\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.411993 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412018 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-var-lib-openvswitch\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412044 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c58lk\" (UniqueName: \"kubernetes.io/projected/992361e5-8eb9-426d-9eed-afffb0c30615-kube-api-access-c58lk\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412074 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-ovn-kubernetes\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412083 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412098 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-slash\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412114 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412121 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412160 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-var-lib-cni-networks-ovn-kubernetes\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412168 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412181 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-node-log\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412219 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-netd\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412235 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-bin\") pod \"992361e5-8eb9-426d-9eed-afffb0c30615\" (UID: \"992361e5-8eb9-426d-9eed-afffb0c30615\") " Jan 21 09:12:32 crc 
kubenswrapper[4618]: I0121 09:12:32.412181 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-slash" (OuterVolumeSpecName: "host-slash") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412201 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412233 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-node-log" (OuterVolumeSpecName: "node-log") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412262 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412313 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412358 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412394 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412730 4618 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412758 4618 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412772 4618 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412786 4618 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-slash\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412798 4618 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412970 4618 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-node-log\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.412991 4618 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 21 
09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413003 4618 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413014 4618 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413025 4618 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413025 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413035 4618 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413076 4618 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413087 4618 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413097 4618 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-log-socket\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413106 4618 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.413115 4618 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.416949 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992361e5-8eb9-426d-9eed-afffb0c30615-kube-api-access-c58lk" (OuterVolumeSpecName: "kube-api-access-c58lk") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: 
"992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "kube-api-access-c58lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.417095 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992361e5-8eb9-426d-9eed-afffb0c30615-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.423325 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "992361e5-8eb9-426d-9eed-afffb0c30615" (UID: "992361e5-8eb9-426d-9eed-afffb0c30615"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514611 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-ovn\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514665 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-slash\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514687 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-cni-netd\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514710 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-ovnkube-script-lib\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514835 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-ovnkube-config\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514935 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.514976 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-cni-bin\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515007 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/174633cf-9bd3-463e-b103-2b545cbb9f06-ovn-node-metrics-cert\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515051 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-node-log\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515124 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-etc-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515205 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-systemd-units\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515245 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-systemd\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515277 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-run-ovn-kubernetes\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515300 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-log-socket\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515331 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-kubelet\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515360 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vhqq\" (UniqueName: \"kubernetes.io/projected/174633cf-9bd3-463e-b103-2b545cbb9f06-kube-api-access-4vhqq\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515387 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-env-overrides\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc 
kubenswrapper[4618]: I0121 09:12:32.515553 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-var-lib-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515628 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515696 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-run-netns\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515813 4618 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/992361e5-8eb9-426d-9eed-afffb0c30615-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515834 4618 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/992361e5-8eb9-426d-9eed-afffb0c30615-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515846 4618 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/992361e5-8eb9-426d-9eed-afffb0c30615-run-systemd\") on node \"crc\" 
DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.515857 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c58lk\" (UniqueName: \"kubernetes.io/projected/992361e5-8eb9-426d-9eed-afffb0c30615-kube-api-access-c58lk\") on node \"crc\" DevicePath \"\"" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617395 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-run-netns\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-ovn\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617493 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-slash\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617512 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-cni-netd\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617536 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-ovnkube-script-lib\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617559 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-run-netns\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617674 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-slash\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617690 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-ovn\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617754 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-cni-netd\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617561 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-ovnkube-config\") pod \"ovnkube-node-lnh42\" (UID: 
\"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617930 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617976 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.617992 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-cni-bin\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618042 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/174633cf-9bd3-463e-b103-2b545cbb9f06-ovn-node-metrics-cert\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618115 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-node-log\") pod 
\"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618138 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-cni-bin\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618248 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-node-log\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618244 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-etc-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618280 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-etc-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618307 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-systemd-units\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618334 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-systemd\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618361 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-run-ovn-kubernetes\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618379 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-log-socket\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618389 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-ovnkube-script-lib\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618408 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-systemd\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc 
kubenswrapper[4618]: I0121 09:12:32.618396 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-kubelet\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618428 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-run-ovn-kubernetes\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618412 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-host-kubelet\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618450 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vhqq\" (UniqueName: \"kubernetes.io/projected/174633cf-9bd3-463e-b103-2b545cbb9f06-kube-api-access-4vhqq\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618456 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-log-socket\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618478 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-env-overrides\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618481 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-systemd-units\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618511 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-var-lib-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618537 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618538 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-ovnkube-config\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618610 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-run-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.618603 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/174633cf-9bd3-463e-b103-2b545cbb9f06-var-lib-openvswitch\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.619017 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/174633cf-9bd3-463e-b103-2b545cbb9f06-env-overrides\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.621127 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/174633cf-9bd3-463e-b103-2b545cbb9f06-ovn-node-metrics-cert\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.632930 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vhqq\" (UniqueName: \"kubernetes.io/projected/174633cf-9bd3-463e-b103-2b545cbb9f06-kube-api-access-4vhqq\") pod \"ovnkube-node-lnh42\" (UID: \"174633cf-9bd3-463e-b103-2b545cbb9f06\") " pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.664076 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.890563 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/2.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.891094 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/1.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.891133 4618 generic.go:334] "Generic (PLEG): container finished" podID="052a66c4-94ce-4336-93f6-1d0023e58cc4" containerID="5b719c70b7e55c9d84d4fc736a13cd679032cac98e4d6fb99ff06c96e560e36a" exitCode=2 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.891211 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerDied","Data":"5b719c70b7e55c9d84d4fc736a13cd679032cac98e4d6fb99ff06c96e560e36a"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.891248 4618 scope.go:117] "RemoveContainer" containerID="0af2fdb0801a0d22f36c5b38722c459dd9910bbf426168b4f5599e87de278ba0" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.891571 4618 scope.go:117] "RemoveContainer" containerID="5b719c70b7e55c9d84d4fc736a13cd679032cac98e4d6fb99ff06c96e560e36a" Jan 21 09:12:32 crc kubenswrapper[4618]: E0121 09:12:32.891729 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-m6jz5_openshift-multus(052a66c4-94ce-4336-93f6-1d0023e58cc4)\"" pod="openshift-multus/multus-m6jz5" podUID="052a66c4-94ce-4336-93f6-1d0023e58cc4" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.892451 4618 generic.go:334] "Generic (PLEG): container finished" podID="174633cf-9bd3-463e-b103-2b545cbb9f06" 
containerID="f22d80f70e1b6510c45fe645ec00f920bc6e14114d944704ea22f311ded8d28d" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.892515 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerDied","Data":"f22d80f70e1b6510c45fe645ec00f920bc6e14114d944704ea22f311ded8d28d"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.892548 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"b72466fdc3ab8af6f65fa518f9be12f0205e98f6bf7d6f03a6e08bf9e9630dd8"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.894363 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovnkube-controller/3.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.896820 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovn-acl-logging/0.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897285 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-894tg_992361e5-8eb9-426d-9eed-afffb0c30615/ovn-controller/0.log" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897692 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897784 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897848 
4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897911 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897959 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898035 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" exitCode=0 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898103 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" exitCode=143 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898187 4618 generic.go:334] "Generic (PLEG): container finished" podID="992361e5-8eb9-426d-9eed-afffb0c30615" containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" exitCode=143 Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897714 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898276 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" 
event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898290 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898300 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898310 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898320 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.897763 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898329 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898340 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898345 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898350 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898356 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898361 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898365 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898370 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898375 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898379 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898386 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898393 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898398 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898404 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898408 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898413 4618 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898419 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898423 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898427 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898431 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898436 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898442 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898451 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} Jan 21 
09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898456 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898460 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898465 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898469 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898474 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898479 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898483 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898489 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} Jan 21 
09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898493 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898499 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-894tg" event={"ID":"992361e5-8eb9-426d-9eed-afffb0c30615","Type":"ContainerDied","Data":"db7e2558f63516ef0c2a647348cc48b93e4a889b12023e6ba37db3dcad500151"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898506 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898512 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898516 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898521 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898526 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898530 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898535 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898539 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898544 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.898548 4618 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.911925 4618 scope.go:117] "RemoveContainer" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.932545 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.942229 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-894tg"] Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.946717 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-894tg"] Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.955418 4618 scope.go:117] "RemoveContainer" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" Jan 21 09:12:32 crc 
kubenswrapper[4618]: I0121 09:12:32.977305 4618 scope.go:117] "RemoveContainer" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" Jan 21 09:12:32 crc kubenswrapper[4618]: I0121 09:12:32.990981 4618 scope.go:117] "RemoveContainer" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.005078 4618 scope.go:117] "RemoveContainer" containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.022067 4618 scope.go:117] "RemoveContainer" containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.032768 4618 scope.go:117] "RemoveContainer" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.051553 4618 scope.go:117] "RemoveContainer" containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.072890 4618 scope.go:117] "RemoveContainer" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.086771 4618 scope.go:117] "RemoveContainer" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.087198 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": container with ID starting with 4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4 not found: ID does not exist" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.087248 4618 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} err="failed to get container status \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": rpc error: code = NotFound desc = could not find container \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": container with ID starting with 4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.087273 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.087731 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": container with ID starting with 47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b not found: ID does not exist" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.087766 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} err="failed to get container status \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": rpc error: code = NotFound desc = could not find container \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": container with ID starting with 47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.087789 4618 scope.go:117] "RemoveContainer" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.088167 4618 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": container with ID starting with 8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723 not found: ID does not exist" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.088226 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} err="failed to get container status \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": rpc error: code = NotFound desc = could not find container \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": container with ID starting with 8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.088249 4618 scope.go:117] "RemoveContainer" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.088548 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": container with ID starting with 641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4 not found: ID does not exist" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.088575 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} err="failed to get container status \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": rpc error: code = NotFound desc = could not find container 
\"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": container with ID starting with 641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.088593 4618 scope.go:117] "RemoveContainer" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.089010 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": container with ID starting with ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b not found: ID does not exist" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.089043 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} err="failed to get container status \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": rpc error: code = NotFound desc = could not find container \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": container with ID starting with ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.089069 4618 scope.go:117] "RemoveContainer" containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.089417 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": container with ID starting with 59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f not found: ID does not exist" 
containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.089446 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} err="failed to get container status \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": rpc error: code = NotFound desc = could not find container \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": container with ID starting with 59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.089467 4618 scope.go:117] "RemoveContainer" containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.089794 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": container with ID starting with a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0 not found: ID does not exist" containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.089822 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} err="failed to get container status \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": rpc error: code = NotFound desc = could not find container \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": container with ID starting with a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.089840 4618 scope.go:117] 
"RemoveContainer" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.090068 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": container with ID starting with c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0 not found: ID does not exist" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090092 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} err="failed to get container status \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": rpc error: code = NotFound desc = could not find container \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": container with ID starting with c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090106 4618 scope.go:117] "RemoveContainer" containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.090393 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": container with ID starting with 95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d not found: ID does not exist" containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090414 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} err="failed to get container status \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": rpc error: code = NotFound desc = could not find container \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": container with ID starting with 95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090428 4618 scope.go:117] "RemoveContainer" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" Jan 21 09:12:33 crc kubenswrapper[4618]: E0121 09:12:33.090626 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": container with ID starting with e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702 not found: ID does not exist" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090646 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} err="failed to get container status \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": rpc error: code = NotFound desc = could not find container \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": container with ID starting with e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090659 4618 scope.go:117] "RemoveContainer" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090911 4618 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} err="failed to get container status \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": rpc error: code = NotFound desc = could not find container \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": container with ID starting with 4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.090943 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.091191 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} err="failed to get container status \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": rpc error: code = NotFound desc = could not find container \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": container with ID starting with 47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.091216 4618 scope.go:117] "RemoveContainer" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.091528 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} err="failed to get container status \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": rpc error: code = NotFound desc = could not find container \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": container with ID starting with 8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723 not 
found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.091610 4618 scope.go:117] "RemoveContainer" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.092502 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} err="failed to get container status \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": rpc error: code = NotFound desc = could not find container \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": container with ID starting with 641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.092528 4618 scope.go:117] "RemoveContainer" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.092808 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} err="failed to get container status \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": rpc error: code = NotFound desc = could not find container \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": container with ID starting with ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.092834 4618 scope.go:117] "RemoveContainer" containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093171 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} err="failed to get 
container status \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": rpc error: code = NotFound desc = could not find container \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": container with ID starting with 59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093196 4618 scope.go:117] "RemoveContainer" containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093447 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} err="failed to get container status \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": rpc error: code = NotFound desc = could not find container \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": container with ID starting with a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093468 4618 scope.go:117] "RemoveContainer" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093749 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} err="failed to get container status \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": rpc error: code = NotFound desc = could not find container \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": container with ID starting with c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093767 4618 scope.go:117] "RemoveContainer" 
containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093950 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} err="failed to get container status \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": rpc error: code = NotFound desc = could not find container \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": container with ID starting with 95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.093979 4618 scope.go:117] "RemoveContainer" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.094766 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} err="failed to get container status \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": rpc error: code = NotFound desc = could not find container \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": container with ID starting with e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.094795 4618 scope.go:117] "RemoveContainer" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.095097 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} err="failed to get container status \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": rpc error: code = NotFound desc = could 
not find container \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": container with ID starting with 4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.095119 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.096329 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} err="failed to get container status \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": rpc error: code = NotFound desc = could not find container \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": container with ID starting with 47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.096391 4618 scope.go:117] "RemoveContainer" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.097470 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} err="failed to get container status \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": rpc error: code = NotFound desc = could not find container \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": container with ID starting with 8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.097498 4618 scope.go:117] "RemoveContainer" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 
09:12:33.097772 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} err="failed to get container status \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": rpc error: code = NotFound desc = could not find container \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": container with ID starting with 641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.097797 4618 scope.go:117] "RemoveContainer" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.098152 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} err="failed to get container status \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": rpc error: code = NotFound desc = could not find container \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": container with ID starting with ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.098171 4618 scope.go:117] "RemoveContainer" containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.098463 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} err="failed to get container status \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": rpc error: code = NotFound desc = could not find container \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": container with ID starting with 
59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.098486 4618 scope.go:117] "RemoveContainer" containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.098745 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} err="failed to get container status \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": rpc error: code = NotFound desc = could not find container \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": container with ID starting with a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.098780 4618 scope.go:117] "RemoveContainer" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.099065 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} err="failed to get container status \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": rpc error: code = NotFound desc = could not find container \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": container with ID starting with c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.099104 4618 scope.go:117] "RemoveContainer" containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.099428 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} err="failed to get container status \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": rpc error: code = NotFound desc = could not find container \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": container with ID starting with 95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.099456 4618 scope.go:117] "RemoveContainer" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.099811 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} err="failed to get container status \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": rpc error: code = NotFound desc = could not find container \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": container with ID starting with e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.099843 4618 scope.go:117] "RemoveContainer" containerID="4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100108 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4"} err="failed to get container status \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": rpc error: code = NotFound desc = could not find container \"4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4\": container with ID starting with 4ee1361318272a0017b93fe2fc0f22816ad13d2ea5f0338f5249cc5a20eb60b4 not found: ID does not 
exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100133 4618 scope.go:117] "RemoveContainer" containerID="47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100393 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b"} err="failed to get container status \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": rpc error: code = NotFound desc = could not find container \"47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b\": container with ID starting with 47c7689d995c54732e8174d4ae1608c508952dd9d37a981f193830ee9b71751b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100435 4618 scope.go:117] "RemoveContainer" containerID="8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100676 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723"} err="failed to get container status \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": rpc error: code = NotFound desc = could not find container \"8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723\": container with ID starting with 8a37464a493be3e17bb9d9827f2e65302cf1b3d6e1beec02f12288ab1effb723 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100701 4618 scope.go:117] "RemoveContainer" containerID="641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100933 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4"} err="failed to get container status 
\"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": rpc error: code = NotFound desc = could not find container \"641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4\": container with ID starting with 641a5b4bd4fed795dbf4a5504eb51a263de1702aaa94e7fc5682d3877e2ef9b4 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.100962 4618 scope.go:117] "RemoveContainer" containerID="ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101220 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b"} err="failed to get container status \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": rpc error: code = NotFound desc = could not find container \"ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b\": container with ID starting with ff077b14150b67652ffc402d300f349eea6315e8e96080f27a0bef868cda9d3b not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101247 4618 scope.go:117] "RemoveContainer" containerID="59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101477 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f"} err="failed to get container status \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": rpc error: code = NotFound desc = could not find container \"59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f\": container with ID starting with 59bb0ddaad3c8b82728cb0ef80f4f4bffb9c14411e4cccacc2fad890a9c08e2f not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101503 4618 scope.go:117] "RemoveContainer" 
containerID="a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101732 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0"} err="failed to get container status \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": rpc error: code = NotFound desc = could not find container \"a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0\": container with ID starting with a5dade0ec387b9f983db5e265039734b623ca19543e340c79c416598f11837a0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101757 4618 scope.go:117] "RemoveContainer" containerID="c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.101993 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0"} err="failed to get container status \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": rpc error: code = NotFound desc = could not find container \"c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0\": container with ID starting with c97b0b32475f009a0108ebdd012e6e6e8bdab88e53719f3053523de4622cdda0 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.102017 4618 scope.go:117] "RemoveContainer" containerID="95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.102241 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d"} err="failed to get container status \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": rpc error: code = NotFound desc = could 
not find container \"95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d\": container with ID starting with 95dc02b47b990a14953b945b00e534f8962e5ce1548b3602c3d216522a66381d not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.102262 4618 scope.go:117] "RemoveContainer" containerID="e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.102474 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702"} err="failed to get container status \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": rpc error: code = NotFound desc = could not find container \"e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702\": container with ID starting with e25c1d1f6a6c98a1b15f209edd368769b9671eed73c2e65e2e2c597895cfd702 not found: ID does not exist" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.544156 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992361e5-8eb9-426d-9eed-afffb0c30615" path="/var/lib/kubelet/pods/992361e5-8eb9-426d-9eed-afffb0c30615/volumes" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.905399 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/2.log" Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.908394 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"369c5037a7e3fefd5878676c2fbc040763b90de4fe71113de9321980665217ac"} Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.908434 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" 
event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"941bd95c03cab61f98300e0b534e3e026cc6e54a0c562700f1b044eb53b29052"} Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.908448 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"a904f4866caa27f7d4872260075da6e6481dcaa97b992684556d9cf3647565d9"} Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.908457 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"0c9d854657faf7fdc75cafe6dff5bfdacddfedf10f1ebfa6a33c9145b6177540"} Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.908468 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"91215ec383560f4f8f20455901658de6f5fd6614116ba1ccecac915723b4473a"} Jan 21 09:12:33 crc kubenswrapper[4618]: I0121 09:12:33.908481 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"29e31a4ee764df0b2d67112510a6faec69dbb1ab880ddcfb347ef39e9228fc6e"} Jan 21 09:12:35 crc kubenswrapper[4618]: I0121 09:12:35.919759 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"da0aba66e2a8da92388f40ece6a8d1c718b389bc4f71d424060d227815b3693b"} Jan 21 09:12:37 crc kubenswrapper[4618]: I0121 09:12:37.931134 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" 
event={"ID":"174633cf-9bd3-463e-b103-2b545cbb9f06","Type":"ContainerStarted","Data":"2aec278eca4661ec4ef27cb8d0d026d1a3bb7b0d4f0ed7e6af7aecf16745faf5"} Jan 21 09:12:37 crc kubenswrapper[4618]: I0121 09:12:37.931561 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:37 crc kubenswrapper[4618]: I0121 09:12:37.952634 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:37 crc kubenswrapper[4618]: I0121 09:12:37.961122 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" podStartSLOduration=5.96110988 podStartE2EDuration="5.96110988s" podCreationTimestamp="2026-01-21 09:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:12:37.956235341 +0000 UTC m=+556.706702659" watchObservedRunningTime="2026-01-21 09:12:37.96110988 +0000 UTC m=+556.711577198" Jan 21 09:12:38 crc kubenswrapper[4618]: I0121 09:12:38.934912 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:38 crc kubenswrapper[4618]: I0121 09:12:38.935230 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:38 crc kubenswrapper[4618]: I0121 09:12:38.956977 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:12:43 crc kubenswrapper[4618]: I0121 09:12:43.538097 4618 scope.go:117] "RemoveContainer" containerID="5b719c70b7e55c9d84d4fc736a13cd679032cac98e4d6fb99ff06c96e560e36a" Jan 21 09:12:43 crc kubenswrapper[4618]: E0121 09:12:43.538487 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-m6jz5_openshift-multus(052a66c4-94ce-4336-93f6-1d0023e58cc4)\"" pod="openshift-multus/multus-m6jz5" podUID="052a66c4-94ce-4336-93f6-1d0023e58cc4" Jan 21 09:12:54 crc kubenswrapper[4618]: I0121 09:12:54.537332 4618 scope.go:117] "RemoveContainer" containerID="5b719c70b7e55c9d84d4fc736a13cd679032cac98e4d6fb99ff06c96e560e36a" Jan 21 09:12:55 crc kubenswrapper[4618]: I0121 09:12:55.000785 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/2.log" Jan 21 09:12:55 crc kubenswrapper[4618]: I0121 09:12:55.001003 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m6jz5" event={"ID":"052a66c4-94ce-4336-93f6-1d0023e58cc4","Type":"ContainerStarted","Data":"9a6d08e6eada6393c39646b489ccf4b79345888b6e5ed495cb13976fcb23ffea"} Jan 21 09:12:56 crc kubenswrapper[4618]: I0121 09:12:56.958611 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:12:56 crc kubenswrapper[4618]: I0121 09:12:56.958899 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.661341 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr"] Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.662517 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.663605 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.666910 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr"] Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.795262 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.795468 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.795585 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6h5k\" (UniqueName: \"kubernetes.io/projected/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-kube-api-access-n6h5k\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: 
I0121 09:13:00.896993 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.897049 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.897098 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6h5k\" (UniqueName: \"kubernetes.io/projected/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-kube-api-access-n6h5k\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.897693 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.897748 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.912112 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6h5k\" (UniqueName: \"kubernetes.io/projected/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-kube-api-access-n6h5k\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:00 crc kubenswrapper[4618]: I0121 09:13:00.974228 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:01 crc kubenswrapper[4618]: I0121 09:13:01.298282 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr"] Jan 21 09:13:02 crc kubenswrapper[4618]: I0121 09:13:02.028925 4618 generic.go:334] "Generic (PLEG): container finished" podID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerID="d748269ed5f2bf7920a9200d87eba37713d24d85eee5cdaf1d3754161d9f8edf" exitCode=0 Jan 21 09:13:02 crc kubenswrapper[4618]: I0121 09:13:02.029019 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" event={"ID":"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd","Type":"ContainerDied","Data":"d748269ed5f2bf7920a9200d87eba37713d24d85eee5cdaf1d3754161d9f8edf"} Jan 21 09:13:02 crc kubenswrapper[4618]: I0121 09:13:02.029170 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" event={"ID":"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd","Type":"ContainerStarted","Data":"8283a8826ca4f8f66a7b591a3d24e93af4287a94641e8fa91d6c2280e2bb9a53"} Jan 21 09:13:02 crc kubenswrapper[4618]: I0121 09:13:02.680593 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lnh42" Jan 21 09:13:04 crc kubenswrapper[4618]: I0121 09:13:04.039022 4618 generic.go:334] "Generic (PLEG): container finished" podID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerID="1c0200bc1746e6c30f0874dad1fa1f2f93723f66588a5f62f340c394688dcd50" exitCode=0 Jan 21 09:13:04 crc kubenswrapper[4618]: I0121 09:13:04.039118 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" event={"ID":"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd","Type":"ContainerDied","Data":"1c0200bc1746e6c30f0874dad1fa1f2f93723f66588a5f62f340c394688dcd50"} Jan 21 09:13:05 crc kubenswrapper[4618]: I0121 09:13:05.044927 4618 generic.go:334] "Generic (PLEG): container finished" podID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerID="85ef3c70234cfe35b5e6fd12f8aead5b3c513bcda3ddae03a24dabaf37cb5123" exitCode=0 Jan 21 09:13:05 crc kubenswrapper[4618]: I0121 09:13:05.045020 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" event={"ID":"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd","Type":"ContainerDied","Data":"85ef3c70234cfe35b5e6fd12f8aead5b3c513bcda3ddae03a24dabaf37cb5123"} Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.205416 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.346449 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-util\") pod \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.346531 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6h5k\" (UniqueName: \"kubernetes.io/projected/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-kube-api-access-n6h5k\") pod \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.346553 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-bundle\") pod \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\" (UID: \"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd\") " Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.347028 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-bundle" (OuterVolumeSpecName: "bundle") pod "ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" (UID: "ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.350555 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-kube-api-access-n6h5k" (OuterVolumeSpecName: "kube-api-access-n6h5k") pod "ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" (UID: "ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd"). InnerVolumeSpecName "kube-api-access-n6h5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.356203 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-util" (OuterVolumeSpecName: "util") pod "ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" (UID: "ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.447717 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6h5k\" (UniqueName: \"kubernetes.io/projected/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-kube-api-access-n6h5k\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.447741 4618 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:06 crc kubenswrapper[4618]: I0121 09:13:06.447749 4618 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd-util\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:07 crc kubenswrapper[4618]: I0121 09:13:07.053698 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" event={"ID":"ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd","Type":"ContainerDied","Data":"8283a8826ca4f8f66a7b591a3d24e93af4287a94641e8fa91d6c2280e2bb9a53"} Jan 21 09:13:07 crc kubenswrapper[4618]: I0121 09:13:07.053753 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8283a8826ca4f8f66a7b591a3d24e93af4287a94641e8fa91d6c2280e2bb9a53" Jan 21 09:13:07 crc kubenswrapper[4618]: I0121 09:13:07.053792 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.477736 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-dcjhc"] Jan 21 09:13:08 crc kubenswrapper[4618]: E0121 09:13:08.478124 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="util" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.478135 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="util" Jan 21 09:13:08 crc kubenswrapper[4618]: E0121 09:13:08.478189 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="pull" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.478195 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="pull" Jan 21 09:13:08 crc kubenswrapper[4618]: E0121 09:13:08.478206 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="extract" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.478212 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="extract" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.478307 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd" containerName="extract" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.478622 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.479744 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-kkk77" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.480165 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.483707 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.486423 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-dcjhc"] Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.669407 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knpxm\" (UniqueName: \"kubernetes.io/projected/80022532-8c85-41c8-8c65-a67f28411a13-kube-api-access-knpxm\") pod \"nmstate-operator-646758c888-dcjhc\" (UID: \"80022532-8c85-41c8-8c65-a67f28411a13\") " pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.770179 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knpxm\" (UniqueName: \"kubernetes.io/projected/80022532-8c85-41c8-8c65-a67f28411a13-kube-api-access-knpxm\") pod \"nmstate-operator-646758c888-dcjhc\" (UID: \"80022532-8c85-41c8-8c65-a67f28411a13\") " pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.786880 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knpxm\" (UniqueName: \"kubernetes.io/projected/80022532-8c85-41c8-8c65-a67f28411a13-kube-api-access-knpxm\") pod \"nmstate-operator-646758c888-dcjhc\" (UID: 
\"80022532-8c85-41c8-8c65-a67f28411a13\") " pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" Jan 21 09:13:08 crc kubenswrapper[4618]: I0121 09:13:08.789512 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" Jan 21 09:13:09 crc kubenswrapper[4618]: I0121 09:13:09.125836 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-dcjhc"] Jan 21 09:13:09 crc kubenswrapper[4618]: W0121 09:13:09.128806 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80022532_8c85_41c8_8c65_a67f28411a13.slice/crio-4d5e5ad104c3ff97f1db2eb5fbf7f37491c0b448fc1082d2a28b0c3c1b0aae2a WatchSource:0}: Error finding container 4d5e5ad104c3ff97f1db2eb5fbf7f37491c0b448fc1082d2a28b0c3c1b0aae2a: Status 404 returned error can't find the container with id 4d5e5ad104c3ff97f1db2eb5fbf7f37491c0b448fc1082d2a28b0c3c1b0aae2a Jan 21 09:13:10 crc kubenswrapper[4618]: I0121 09:13:10.066496 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" event={"ID":"80022532-8c85-41c8-8c65-a67f28411a13","Type":"ContainerStarted","Data":"4d5e5ad104c3ff97f1db2eb5fbf7f37491c0b448fc1082d2a28b0c3c1b0aae2a"} Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.071382 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" event={"ID":"80022532-8c85-41c8-8c65-a67f28411a13","Type":"ContainerStarted","Data":"01161781765e336f308004b1005523b8906cb019d8f6f06170434ced42627992"} Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.080967 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-dcjhc" podStartSLOduration=1.468437445 podStartE2EDuration="3.080955265s" podCreationTimestamp="2026-01-21 09:13:08 +0000 UTC" 
firstStartedPulling="2026-01-21 09:13:09.13078697 +0000 UTC m=+587.881254287" lastFinishedPulling="2026-01-21 09:13:10.74330479 +0000 UTC m=+589.493772107" observedRunningTime="2026-01-21 09:13:11.07994669 +0000 UTC m=+589.830414007" watchObservedRunningTime="2026-01-21 09:13:11.080955265 +0000 UTC m=+589.831422582" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.805335 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-8r4qk"] Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.806020 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.807914 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hxrp7" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.811498 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd"] Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.812074 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.813527 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.821746 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-8r4qk"] Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.827955 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fdzmd"] Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.828616 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.830792 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd"] Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908186 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv"] Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908423 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-dbus-socket\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908473 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908497 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-nmstate-lock\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908632 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhmd2\" (UniqueName: \"kubernetes.io/projected/822b5ec2-ecb3-459a-8445-6722cc28e866-kube-api-access-dhmd2\") pod \"nmstate-metrics-54757c584b-8r4qk\" (UID: 
\"822b5ec2-ecb3-459a-8445-6722cc28e866\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908682 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-ovs-socket\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908712 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908725 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjvcn\" (UniqueName: \"kubernetes.io/projected/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-kube-api-access-rjvcn\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.908813 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86zn5\" (UniqueName: \"kubernetes.io/projected/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-kube-api-access-86zn5\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.915762 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.915842 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gcqwv" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.915864 4618 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 21 09:13:11 crc kubenswrapper[4618]: I0121 09:13:11.920277 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv"] Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009500 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-ovs-socket\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009532 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjvcn\" (UniqueName: \"kubernetes.io/projected/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-kube-api-access-rjvcn\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009558 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86zn5\" (UniqueName: \"kubernetes.io/projected/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-kube-api-access-86zn5\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009591 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fc037d-6b85-473a-bd03-3a266430e4e2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009609 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-dbus-socket\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009614 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-ovs-socket\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009635 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5t6\" (UniqueName: \"kubernetes.io/projected/d7fc037d-6b85-473a-bd03-3a266430e4e2-kube-api-access-fr5t6\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009659 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009677 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-nmstate-lock\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009692 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d7fc037d-6b85-473a-bd03-3a266430e4e2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009724 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhmd2\" (UniqueName: \"kubernetes.io/projected/822b5ec2-ecb3-459a-8445-6722cc28e866-kube-api-access-dhmd2\") pod \"nmstate-metrics-54757c584b-8r4qk\" (UID: \"822b5ec2-ecb3-459a-8445-6722cc28e866\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" Jan 21 09:13:12 crc kubenswrapper[4618]: E0121 09:13:12.009930 4618 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.009961 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-nmstate-lock\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: E0121 09:13:12.009980 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-tls-key-pair podName:71e9ce01-3713-4cf6-a76e-ad21ac16e10e nodeName:}" failed. No retries permitted until 2026-01-21 09:13:12.509967607 +0000 UTC m=+591.260434923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-lrckd" (UID: "71e9ce01-3713-4cf6-a76e-ad21ac16e10e") : secret "openshift-nmstate-webhook" not found Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.010281 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-dbus-socket\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.027571 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhmd2\" (UniqueName: \"kubernetes.io/projected/822b5ec2-ecb3-459a-8445-6722cc28e866-kube-api-access-dhmd2\") pod \"nmstate-metrics-54757c584b-8r4qk\" (UID: \"822b5ec2-ecb3-459a-8445-6722cc28e866\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.027752 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86zn5\" (UniqueName: \"kubernetes.io/projected/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-kube-api-access-86zn5\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.033845 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjvcn\" (UniqueName: \"kubernetes.io/projected/a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2-kube-api-access-rjvcn\") pod \"nmstate-handler-fdzmd\" (UID: \"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2\") " pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.080256 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-77b44546b8-kqhdq"] Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.080770 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.089672 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b44546b8-kqhdq"] Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111031 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-trusted-ca-bundle\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111073 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-oauth-serving-cert\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111190 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fc037d-6b85-473a-bd03-3a266430e4e2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111272 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5t6\" (UniqueName: \"kubernetes.io/projected/d7fc037d-6b85-473a-bd03-3a266430e4e2-kube-api-access-fr5t6\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: 
\"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111359 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d7fc037d-6b85-473a-bd03-3a266430e4e2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111436 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-service-ca\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111465 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55qbp\" (UniqueName: \"kubernetes.io/projected/6412923e-5d51-4bb4-8350-2189d7e42b1e-kube-api-access-55qbp\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111574 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-serving-cert\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111639 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-oauth-config\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.111662 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-config\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.112092 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d7fc037d-6b85-473a-bd03-3a266430e4e2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.114840 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7fc037d-6b85-473a-bd03-3a266430e4e2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.118325 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.125945 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr5t6\" (UniqueName: \"kubernetes.io/projected/d7fc037d-6b85-473a-bd03-3a266430e4e2-kube-api-access-fr5t6\") pod \"nmstate-console-plugin-7754f76f8b-sqxmv\" (UID: \"d7fc037d-6b85-473a-bd03-3a266430e4e2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.139401 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.212973 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-oauth-config\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.213031 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-config\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.213085 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-trusted-ca-bundle\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.213274 4618 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-oauth-serving-cert\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214072 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-config\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214154 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-oauth-serving-cert\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214171 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-service-ca\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214185 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-trusted-ca-bundle\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214215 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qbp\" (UniqueName: 
\"kubernetes.io/projected/6412923e-5d51-4bb4-8350-2189d7e42b1e-kube-api-access-55qbp\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214259 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-serving-cert\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.214861 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6412923e-5d51-4bb4-8350-2189d7e42b1e-service-ca\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.215463 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-oauth-config\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.216674 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6412923e-5d51-4bb4-8350-2189d7e42b1e-console-serving-cert\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.218659 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.227070 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qbp\" (UniqueName: \"kubernetes.io/projected/6412923e-5d51-4bb4-8350-2189d7e42b1e-kube-api-access-55qbp\") pod \"console-77b44546b8-kqhdq\" (UID: \"6412923e-5d51-4bb4-8350-2189d7e42b1e\") " pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.250441 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-8r4qk"] Jan 21 09:13:12 crc kubenswrapper[4618]: W0121 09:13:12.252516 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822b5ec2_ecb3_459a_8445_6722cc28e866.slice/crio-483db62679e849d047273040c136063e1ce5f7da8a59fe2ca0bff3e5c6184912 WatchSource:0}: Error finding container 483db62679e849d047273040c136063e1ce5f7da8a59fe2ca0bff3e5c6184912: Status 404 returned error can't find the container with id 483db62679e849d047273040c136063e1ce5f7da8a59fe2ca0bff3e5c6184912 Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.392684 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.517553 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.520177 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71e9ce01-3713-4cf6-a76e-ad21ac16e10e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lrckd\" (UID: \"71e9ce01-3713-4cf6-a76e-ad21ac16e10e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.549947 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv"] Jan 21 09:13:12 crc kubenswrapper[4618]: W0121 09:13:12.553831 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7fc037d_6b85_473a_bd03_3a266430e4e2.slice/crio-ee6bf8e24868717acebec8727e19e92fb78b723d3b617b30c63cc89bdb26c122 WatchSource:0}: Error finding container ee6bf8e24868717acebec8727e19e92fb78b723d3b617b30c63cc89bdb26c122: Status 404 returned error can't find the container with id ee6bf8e24868717acebec8727e19e92fb78b723d3b617b30c63cc89bdb26c122 Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.723416 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:12 crc kubenswrapper[4618]: I0121 09:13:12.736853 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b44546b8-kqhdq"] Jan 21 09:13:12 crc kubenswrapper[4618]: W0121 09:13:12.740106 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6412923e_5d51_4bb4_8350_2189d7e42b1e.slice/crio-8456f39d70c814d02d3306cfbe1247705c7b4f8ffdbe4be52397e65976dd4dc5 WatchSource:0}: Error finding container 8456f39d70c814d02d3306cfbe1247705c7b4f8ffdbe4be52397e65976dd4dc5: Status 404 returned error can't find the container with id 8456f39d70c814d02d3306cfbe1247705c7b4f8ffdbe4be52397e65976dd4dc5 Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.047432 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd"] Jan 21 09:13:13 crc kubenswrapper[4618]: W0121 09:13:13.051758 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71e9ce01_3713_4cf6_a76e_ad21ac16e10e.slice/crio-16a9d556f9714284a71f3f1933fddb43d6ef0ec8b4efda316e4781e7e5c7c0d1 WatchSource:0}: Error finding container 16a9d556f9714284a71f3f1933fddb43d6ef0ec8b4efda316e4781e7e5c7c0d1: Status 404 returned error can't find the container with id 16a9d556f9714284a71f3f1933fddb43d6ef0ec8b4efda316e4781e7e5c7c0d1 Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.078966 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" event={"ID":"822b5ec2-ecb3-459a-8445-6722cc28e866","Type":"ContainerStarted","Data":"483db62679e849d047273040c136063e1ce5f7da8a59fe2ca0bff3e5c6184912"} Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.079834 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" event={"ID":"d7fc037d-6b85-473a-bd03-3a266430e4e2","Type":"ContainerStarted","Data":"ee6bf8e24868717acebec8727e19e92fb78b723d3b617b30c63cc89bdb26c122"} Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.080607 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" event={"ID":"71e9ce01-3713-4cf6-a76e-ad21ac16e10e","Type":"ContainerStarted","Data":"16a9d556f9714284a71f3f1933fddb43d6ef0ec8b4efda316e4781e7e5c7c0d1"} Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.081333 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fdzmd" event={"ID":"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2","Type":"ContainerStarted","Data":"e8a0600a6d22d65a0d05b9c1bc69f9b7b96b2dcd2c5bbdba585441e4754c459a"} Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.082400 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b44546b8-kqhdq" event={"ID":"6412923e-5d51-4bb4-8350-2189d7e42b1e","Type":"ContainerStarted","Data":"d7bf4058e4e5bfae14cb8e4d2f10180ff3df358d57d95beb805f3e0647912456"} Jan 21 09:13:13 crc kubenswrapper[4618]: I0121 09:13:13.082424 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b44546b8-kqhdq" event={"ID":"6412923e-5d51-4bb4-8350-2189d7e42b1e","Type":"ContainerStarted","Data":"8456f39d70c814d02d3306cfbe1247705c7b4f8ffdbe4be52397e65976dd4dc5"} Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.092698 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" event={"ID":"822b5ec2-ecb3-459a-8445-6722cc28e866","Type":"ContainerStarted","Data":"8d600370b9e87c9d20e842fc87d7580b855fb26f94a6043f58be4f413faaa7f2"} Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.094516 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" event={"ID":"d7fc037d-6b85-473a-bd03-3a266430e4e2","Type":"ContainerStarted","Data":"8926d701189d178d89c2c9d7ad1b9bfb3db318760e1b4476791f297f41915a0f"} Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.096084 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" event={"ID":"71e9ce01-3713-4cf6-a76e-ad21ac16e10e","Type":"ContainerStarted","Data":"00b694d8a581dae5f89d96b414005642adc1bce3994d8691677adef9de5be3e6"} Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.096277 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.097258 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fdzmd" event={"ID":"a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2","Type":"ContainerStarted","Data":"d76c38e9c3844dd7ebeefdca854930ca05e84e8e2184842c0efe27417a2a370d"} Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.097925 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.109947 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77b44546b8-kqhdq" podStartSLOduration=3.109932072 podStartE2EDuration="3.109932072s" podCreationTimestamp="2026-01-21 09:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:13:13.09338689 +0000 UTC m=+591.843854197" watchObservedRunningTime="2026-01-21 09:13:15.109932072 +0000 UTC m=+593.860399389" Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.110218 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sqxmv" 
podStartSLOduration=1.9453743270000001 podStartE2EDuration="4.11021374s" podCreationTimestamp="2026-01-21 09:13:11 +0000 UTC" firstStartedPulling="2026-01-21 09:13:12.55632662 +0000 UTC m=+591.306793937" lastFinishedPulling="2026-01-21 09:13:14.721166033 +0000 UTC m=+593.471633350" observedRunningTime="2026-01-21 09:13:15.10797194 +0000 UTC m=+593.858439257" watchObservedRunningTime="2026-01-21 09:13:15.11021374 +0000 UTC m=+593.860681058" Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.121937 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fdzmd" podStartSLOduration=1.5524298029999999 podStartE2EDuration="4.121561795s" podCreationTimestamp="2026-01-21 09:13:11 +0000 UTC" firstStartedPulling="2026-01-21 09:13:12.156100659 +0000 UTC m=+590.906567976" lastFinishedPulling="2026-01-21 09:13:14.725232652 +0000 UTC m=+593.475699968" observedRunningTime="2026-01-21 09:13:15.11780573 +0000 UTC m=+593.868273046" watchObservedRunningTime="2026-01-21 09:13:15.121561795 +0000 UTC m=+593.872029112" Jan 21 09:13:15 crc kubenswrapper[4618]: I0121 09:13:15.130432 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" podStartSLOduration=2.462113011 podStartE2EDuration="4.130415894s" podCreationTimestamp="2026-01-21 09:13:11 +0000 UTC" firstStartedPulling="2026-01-21 09:13:13.053132576 +0000 UTC m=+591.803599893" lastFinishedPulling="2026-01-21 09:13:14.721435459 +0000 UTC m=+593.471902776" observedRunningTime="2026-01-21 09:13:15.128368007 +0000 UTC m=+593.878835323" watchObservedRunningTime="2026-01-21 09:13:15.130415894 +0000 UTC m=+593.880883210" Jan 21 09:13:17 crc kubenswrapper[4618]: I0121 09:13:17.110178 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" 
event={"ID":"822b5ec2-ecb3-459a-8445-6722cc28e866","Type":"ContainerStarted","Data":"329256074c5e3e9b9c81a0c0b452f3aaa315da68762ec2e79bf289627552a44f"} Jan 21 09:13:17 crc kubenswrapper[4618]: I0121 09:13:17.131003 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-8r4qk" podStartSLOduration=1.953967838 podStartE2EDuration="6.130987103s" podCreationTimestamp="2026-01-21 09:13:11 +0000 UTC" firstStartedPulling="2026-01-21 09:13:12.254044894 +0000 UTC m=+591.004512212" lastFinishedPulling="2026-01-21 09:13:16.43106416 +0000 UTC m=+595.181531477" observedRunningTime="2026-01-21 09:13:17.128266824 +0000 UTC m=+595.878734141" watchObservedRunningTime="2026-01-21 09:13:17.130987103 +0000 UTC m=+595.881454420" Jan 21 09:13:22 crc kubenswrapper[4618]: I0121 09:13:22.157104 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fdzmd" Jan 21 09:13:22 crc kubenswrapper[4618]: I0121 09:13:22.392958 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:22 crc kubenswrapper[4618]: I0121 09:13:22.393002 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:22 crc kubenswrapper[4618]: I0121 09:13:22.396626 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:23 crc kubenswrapper[4618]: I0121 09:13:23.134599 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77b44546b8-kqhdq" Jan 21 09:13:23 crc kubenswrapper[4618]: I0121 09:13:23.163456 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dd2fv"] Jan 21 09:13:26 crc kubenswrapper[4618]: I0121 09:13:26.959222 4618 patch_prober.go:28] interesting 
pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:13:26 crc kubenswrapper[4618]: I0121 09:13:26.959484 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:13:32 crc kubenswrapper[4618]: I0121 09:13:32.728405 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lrckd" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.013385 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82"] Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.015371 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.016780 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.020359 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82"] Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.193212 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.193477 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbgf\" (UniqueName: \"kubernetes.io/projected/56473d23-b169-4791-a419-71d0ddf89139-kube-api-access-dsbgf\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.193522 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: 
I0121 09:13:41.294367 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.294452 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.294488 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbgf\" (UniqueName: \"kubernetes.io/projected/56473d23-b169-4791-a419-71d0ddf89139-kube-api-access-dsbgf\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.294952 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.295034 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.309371 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbgf\" (UniqueName: \"kubernetes.io/projected/56473d23-b169-4791-a419-71d0ddf89139-kube-api-access-dsbgf\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.330993 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:41 crc kubenswrapper[4618]: I0121 09:13:41.662995 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82"] Jan 21 09:13:42 crc kubenswrapper[4618]: I0121 09:13:42.218857 4618 generic.go:334] "Generic (PLEG): container finished" podID="56473d23-b169-4791-a419-71d0ddf89139" containerID="c229decfdec86b6557b0f74af7b340ed61f09164e1f047fcf9db12ae377cf684" exitCode=0 Jan 21 09:13:42 crc kubenswrapper[4618]: I0121 09:13:42.218954 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" event={"ID":"56473d23-b169-4791-a419-71d0ddf89139","Type":"ContainerDied","Data":"c229decfdec86b6557b0f74af7b340ed61f09164e1f047fcf9db12ae377cf684"} Jan 21 09:13:42 crc kubenswrapper[4618]: I0121 09:13:42.219085 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" event={"ID":"56473d23-b169-4791-a419-71d0ddf89139","Type":"ContainerStarted","Data":"dc4b0319b60973a4427149f513b7f514432d70be7fe1dbca0fdfe6fa014c71d1"} Jan 21 09:13:44 crc kubenswrapper[4618]: I0121 09:13:44.227491 4618 generic.go:334] "Generic (PLEG): container finished" podID="56473d23-b169-4791-a419-71d0ddf89139" containerID="2977847a58fedc982c9e00a1b3f60af4245e8b1e62f527e0dd15545ee884e442" exitCode=0 Jan 21 09:13:44 crc kubenswrapper[4618]: I0121 09:13:44.227585 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" event={"ID":"56473d23-b169-4791-a419-71d0ddf89139","Type":"ContainerDied","Data":"2977847a58fedc982c9e00a1b3f60af4245e8b1e62f527e0dd15545ee884e442"} Jan 21 09:13:45 crc kubenswrapper[4618]: I0121 09:13:45.232646 4618 generic.go:334] "Generic (PLEG): container finished" podID="56473d23-b169-4791-a419-71d0ddf89139" containerID="ad3e2e7388ccb092d11cf71b538cf6b69a55bc0854682a7a567ea956363afe50" exitCode=0 Jan 21 09:13:45 crc kubenswrapper[4618]: I0121 09:13:45.232790 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" event={"ID":"56473d23-b169-4791-a419-71d0ddf89139","Type":"ContainerDied","Data":"ad3e2e7388ccb092d11cf71b538cf6b69a55bc0854682a7a567ea956363afe50"} Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.384944 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.544712 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsbgf\" (UniqueName: \"kubernetes.io/projected/56473d23-b169-4791-a419-71d0ddf89139-kube-api-access-dsbgf\") pod \"56473d23-b169-4791-a419-71d0ddf89139\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.544982 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-util\") pod \"56473d23-b169-4791-a419-71d0ddf89139\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.545155 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-bundle\") pod \"56473d23-b169-4791-a419-71d0ddf89139\" (UID: \"56473d23-b169-4791-a419-71d0ddf89139\") " Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.545842 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-bundle" (OuterVolumeSpecName: "bundle") pod "56473d23-b169-4791-a419-71d0ddf89139" (UID: "56473d23-b169-4791-a419-71d0ddf89139"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.548975 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56473d23-b169-4791-a419-71d0ddf89139-kube-api-access-dsbgf" (OuterVolumeSpecName: "kube-api-access-dsbgf") pod "56473d23-b169-4791-a419-71d0ddf89139" (UID: "56473d23-b169-4791-a419-71d0ddf89139"). InnerVolumeSpecName "kube-api-access-dsbgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.554847 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-util" (OuterVolumeSpecName: "util") pod "56473d23-b169-4791-a419-71d0ddf89139" (UID: "56473d23-b169-4791-a419-71d0ddf89139"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.645866 4618 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.645895 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsbgf\" (UniqueName: \"kubernetes.io/projected/56473d23-b169-4791-a419-71d0ddf89139-kube-api-access-dsbgf\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:46 crc kubenswrapper[4618]: I0121 09:13:46.645905 4618 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56473d23-b169-4791-a419-71d0ddf89139-util\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:47 crc kubenswrapper[4618]: I0121 09:13:47.241693 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" event={"ID":"56473d23-b169-4791-a419-71d0ddf89139","Type":"ContainerDied","Data":"dc4b0319b60973a4427149f513b7f514432d70be7fe1dbca0fdfe6fa014c71d1"} Jan 21 09:13:47 crc kubenswrapper[4618]: I0121 09:13:47.241726 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4b0319b60973a4427149f513b7f514432d70be7fe1dbca0fdfe6fa014c71d1" Jan 21 09:13:47 crc kubenswrapper[4618]: I0121 09:13:47.241733 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.187118 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dd2fv" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerName="console" containerID="cri-o://05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f" gracePeriod=15 Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.479776 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dd2fv_8da3ae7d-2af2-436f-85e8-542ae6eab03b/console/0.log" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.479833 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dd2fv" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.563545 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-trusted-ca-bundle\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.563597 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-config\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.563633 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-oauth-config\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc 
kubenswrapper[4618]: I0121 09:13:48.563657 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-service-ca\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.563680 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-oauth-serving-cert\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.563741 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlsn8\" (UniqueName: \"kubernetes.io/projected/8da3ae7d-2af2-436f-85e8-542ae6eab03b-kube-api-access-wlsn8\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.564168 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.564203 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-config" (OuterVolumeSpecName: "console-config") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.564218 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.564303 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-service-ca" (OuterVolumeSpecName: "service-ca") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.571778 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da3ae7d-2af2-436f-85e8-542ae6eab03b-kube-api-access-wlsn8" (OuterVolumeSpecName: "kube-api-access-wlsn8") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "kube-api-access-wlsn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.571799 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664345 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-serving-cert\") pod \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\" (UID: \"8da3ae7d-2af2-436f-85e8-542ae6eab03b\") " Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664712 4618 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664728 4618 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664738 4618 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664750 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlsn8\" (UniqueName: \"kubernetes.io/projected/8da3ae7d-2af2-436f-85e8-542ae6eab03b-kube-api-access-wlsn8\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664760 4618 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.664767 4618 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.668494 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8da3ae7d-2af2-436f-85e8-542ae6eab03b" (UID: "8da3ae7d-2af2-436f-85e8-542ae6eab03b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:13:48 crc kubenswrapper[4618]: I0121 09:13:48.765867 4618 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8da3ae7d-2af2-436f-85e8-542ae6eab03b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.249974 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dd2fv_8da3ae7d-2af2-436f-85e8-542ae6eab03b/console/0.log" Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.250031 4618 generic.go:334] "Generic (PLEG): container finished" podID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerID="05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f" exitCode=2 Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.250057 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dd2fv" event={"ID":"8da3ae7d-2af2-436f-85e8-542ae6eab03b","Type":"ContainerDied","Data":"05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f"} Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.250086 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dd2fv" event={"ID":"8da3ae7d-2af2-436f-85e8-542ae6eab03b","Type":"ContainerDied","Data":"bd3be72cf5180a54be071d83b1708ed1bf759b47d69699af4220900090cf9a10"} Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 
09:13:49.250095 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dd2fv"
Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.250100 4618 scope.go:117] "RemoveContainer" containerID="05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f"
Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.263984 4618 scope.go:117] "RemoveContainer" containerID="05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f"
Jan 21 09:13:49 crc kubenswrapper[4618]: E0121 09:13:49.264323 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f\": container with ID starting with 05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f not found: ID does not exist" containerID="05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f"
Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.264357 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f"} err="failed to get container status \"05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f\": rpc error: code = NotFound desc = could not find container \"05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f\": container with ID starting with 05e5b357113db214a2ea5a30a4de0ab92a016f53250837bb3c8f7a380935e28f not found: ID does not exist"
Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.272186 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dd2fv"]
Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.279357 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dd2fv"]
Jan 21 09:13:49 crc kubenswrapper[4618]: I0121 09:13:49.545351 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" path="/var/lib/kubelet/pods/8da3ae7d-2af2-436f-85e8-542ae6eab03b/volumes"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.455824 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"]
Jan 21 09:13:55 crc kubenswrapper[4618]: E0121 09:13:55.456224 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="extract"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456237 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="extract"
Jan 21 09:13:55 crc kubenswrapper[4618]: E0121 09:13:55.456247 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="pull"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456252 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="pull"
Jan 21 09:13:55 crc kubenswrapper[4618]: E0121 09:13:55.456260 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerName="console"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456266 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerName="console"
Jan 21 09:13:55 crc kubenswrapper[4618]: E0121 09:13:55.456273 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="util"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456278 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="util"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456366 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="56473d23-b169-4791-a419-71d0ddf89139" containerName="extract"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456375 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da3ae7d-2af2-436f-85e8-542ae6eab03b" containerName="console"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.456731 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.458379 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.458546 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.458726 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7hsgq"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.459034 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.459122 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.472444 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"]
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.635724 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b0325f8-aa62-451f-84b7-9f393225ff9d-webhook-cert\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.635820 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbdnq\" (UniqueName: \"kubernetes.io/projected/4b0325f8-aa62-451f-84b7-9f393225ff9d-kube-api-access-kbdnq\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.635898 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b0325f8-aa62-451f-84b7-9f393225ff9d-apiservice-cert\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.681356 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"]
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.681970 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.683618 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.683730 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.685420 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9zfrx"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.693035 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"]
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.736970 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b0325f8-aa62-451f-84b7-9f393225ff9d-webhook-cert\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.737039 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbdnq\" (UniqueName: \"kubernetes.io/projected/4b0325f8-aa62-451f-84b7-9f393225ff9d-kube-api-access-kbdnq\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.737286 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b0325f8-aa62-451f-84b7-9f393225ff9d-apiservice-cert\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.741668 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b0325f8-aa62-451f-84b7-9f393225ff9d-webhook-cert\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.743559 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b0325f8-aa62-451f-84b7-9f393225ff9d-apiservice-cert\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.749997 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbdnq\" (UniqueName: \"kubernetes.io/projected/4b0325f8-aa62-451f-84b7-9f393225ff9d-kube-api-access-kbdnq\") pod \"metallb-operator-controller-manager-656ff8bd-4klk8\" (UID: \"4b0325f8-aa62-451f-84b7-9f393225ff9d\") " pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.769739 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.840440 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xs6j\" (UniqueName: \"kubernetes.io/projected/ecb8ccb1-678b-4dd5-be5e-8296b9305053-kube-api-access-8xs6j\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.840655 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecb8ccb1-678b-4dd5-be5e-8296b9305053-webhook-cert\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.840674 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecb8ccb1-678b-4dd5-be5e-8296b9305053-apiservice-cert\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.941829 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecb8ccb1-678b-4dd5-be5e-8296b9305053-webhook-cert\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.941871 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecb8ccb1-678b-4dd5-be5e-8296b9305053-apiservice-cert\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.941968 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xs6j\" (UniqueName: \"kubernetes.io/projected/ecb8ccb1-678b-4dd5-be5e-8296b9305053-kube-api-access-8xs6j\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.946572 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecb8ccb1-678b-4dd5-be5e-8296b9305053-apiservice-cert\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.946615 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecb8ccb1-678b-4dd5-be5e-8296b9305053-webhook-cert\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.953982 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xs6j\" (UniqueName: \"kubernetes.io/projected/ecb8ccb1-678b-4dd5-be5e-8296b9305053-kube-api-access-8xs6j\") pod \"metallb-operator-webhook-server-8485b999df-6fwkm\" (UID: \"ecb8ccb1-678b-4dd5-be5e-8296b9305053\") " pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:55 crc kubenswrapper[4618]: I0121 09:13:55.992886 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.156839 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"]
Jan 21 09:13:56 crc kubenswrapper[4618]: W0121 09:13:56.160589 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0325f8_aa62_451f_84b7_9f393225ff9d.slice/crio-c5a9846baa546edc5fba6466c247cd0991e1304f8e710350c3f27f6ffa6a4178 WatchSource:0}: Error finding container c5a9846baa546edc5fba6466c247cd0991e1304f8e710350c3f27f6ffa6a4178: Status 404 returned error can't find the container with id c5a9846baa546edc5fba6466c247cd0991e1304f8e710350c3f27f6ffa6a4178
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.167953 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"]
Jan 21 09:13:56 crc kubenswrapper[4618]: W0121 09:13:56.169314 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecb8ccb1_678b_4dd5_be5e_8296b9305053.slice/crio-381fdf819fde419509230cf2bc7dba6ceaec35eda493873d3a8dfcb08f3463cb WatchSource:0}: Error finding container 381fdf819fde419509230cf2bc7dba6ceaec35eda493873d3a8dfcb08f3463cb: Status 404 returned error can't find the container with id 381fdf819fde419509230cf2bc7dba6ceaec35eda493873d3a8dfcb08f3463cb
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.277830 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm" event={"ID":"ecb8ccb1-678b-4dd5-be5e-8296b9305053","Type":"ContainerStarted","Data":"381fdf819fde419509230cf2bc7dba6ceaec35eda493873d3a8dfcb08f3463cb"}
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.278724 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8" event={"ID":"4b0325f8-aa62-451f-84b7-9f393225ff9d","Type":"ContainerStarted","Data":"c5a9846baa546edc5fba6466c247cd0991e1304f8e710350c3f27f6ffa6a4178"}
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.959064 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.959354 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.959412 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47"
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.959815 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54899a279b241edcd830c067b62c5fb70626feb80084fc4b9f8209133774eb23"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 09:13:56 crc kubenswrapper[4618]: I0121 09:13:56.959859 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://54899a279b241edcd830c067b62c5fb70626feb80084fc4b9f8209133774eb23" gracePeriod=600
Jan 21 09:13:57 crc kubenswrapper[4618]: I0121 09:13:57.285022 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="54899a279b241edcd830c067b62c5fb70626feb80084fc4b9f8209133774eb23" exitCode=0
Jan 21 09:13:57 crc kubenswrapper[4618]: I0121 09:13:57.285068 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"54899a279b241edcd830c067b62c5fb70626feb80084fc4b9f8209133774eb23"}
Jan 21 09:13:57 crc kubenswrapper[4618]: I0121 09:13:57.285261 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"b58e609790f66ef2752d711bb33506652d1731feac0ae2d67f3b94e098385deb"}
Jan 21 09:13:57 crc kubenswrapper[4618]: I0121 09:13:57.285282 4618 scope.go:117] "RemoveContainer" containerID="db8abbeb18512486c7d5cdd62c22db63be4c62bacb9e602c0b05fd7df41ce206"
Jan 21 09:13:59 crc kubenswrapper[4618]: I0121 09:13:59.296684 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8" event={"ID":"4b0325f8-aa62-451f-84b7-9f393225ff9d","Type":"ContainerStarted","Data":"d980e2d9f32df262a5e3276f374bdc85170b38974e86fa7e2fabd6364b01b847"}
Jan 21 09:13:59 crc kubenswrapper[4618]: I0121 09:13:59.297206 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:13:59 crc kubenswrapper[4618]: I0121 09:13:59.313342 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8" podStartSLOduration=1.972593891 podStartE2EDuration="4.31332673s" podCreationTimestamp="2026-01-21 09:13:55 +0000 UTC" firstStartedPulling="2026-01-21 09:13:56.162051626 +0000 UTC m=+634.912518943" lastFinishedPulling="2026-01-21 09:13:58.502784466 +0000 UTC m=+637.253251782" observedRunningTime="2026-01-21 09:13:59.31169239 +0000 UTC m=+638.062159708" watchObservedRunningTime="2026-01-21 09:13:59.31332673 +0000 UTC m=+638.063794047"
Jan 21 09:14:00 crc kubenswrapper[4618]: I0121 09:14:00.302530 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm" event={"ID":"ecb8ccb1-678b-4dd5-be5e-8296b9305053","Type":"ContainerStarted","Data":"aa35ccf53fd4d6ac62d7f4dd342fdc9e8e80625c23083fd61e118fb7443f9b99"}
Jan 21 09:14:00 crc kubenswrapper[4618]: I0121 09:14:00.316533 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm" podStartSLOduration=1.947183074 podStartE2EDuration="5.316519976s" podCreationTimestamp="2026-01-21 09:13:55 +0000 UTC" firstStartedPulling="2026-01-21 09:13:56.171411756 +0000 UTC m=+634.921879072" lastFinishedPulling="2026-01-21 09:13:59.540748657 +0000 UTC m=+638.291215974" observedRunningTime="2026-01-21 09:14:00.314350211 +0000 UTC m=+639.064817528" watchObservedRunningTime="2026-01-21 09:14:00.316519976 +0000 UTC m=+639.066987293"
Jan 21 09:14:01 crc kubenswrapper[4618]: I0121 09:14:01.311474 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:14:15 crc kubenswrapper[4618]: I0121 09:14:15.996531 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8485b999df-6fwkm"
Jan 21 09:14:35 crc kubenswrapper[4618]: I0121 09:14:35.772641 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-656ff8bd-4klk8"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.307901 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fljjn"]
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.312653 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"]
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.313058 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.313389 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.315236 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.315483 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.315623 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qrczj"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.315722 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.325209 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"]
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.360639 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bxvc2"]
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.361408 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.362496 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.362985 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ntlmb"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363025 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363592 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-metrics-certs\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363624 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/305963d0-7d19-440d-ba24-c836947123ab-metrics-certs\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363644 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/305963d0-7d19-440d-ba24-c836947123ab-frr-startup\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363663 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5scb\" (UniqueName: \"kubernetes.io/projected/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-kube-api-access-k5scb\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363698 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt25m\" (UniqueName: \"kubernetes.io/projected/5acf067e-b50e-4176-8d97-18188382659a-kube-api-access-xt25m\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363721 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-frr-conf\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363735 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363754 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-frr-sockets\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363769 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-metrics\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363782 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363797 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mgrp\" (UniqueName: \"kubernetes.io/projected/305963d0-7d19-440d-ba24-c836947123ab-kube-api-access-6mgrp\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363809 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5acf067e-b50e-4176-8d97-18188382659a-metallb-excludel2\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.363823 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-reloader\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.364245 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.380408 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-gn7q5"]
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.396115 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-gn7q5"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.399663 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.400050 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-gn7q5"]
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.464921 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-metrics-certs\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.464962 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/305963d0-7d19-440d-ba24-c836947123ab-metrics-certs\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.464985 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/305963d0-7d19-440d-ba24-c836947123ab-frr-startup\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465194 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5scb\" (UniqueName: \"kubernetes.io/projected/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-kube-api-access-k5scb\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465257 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt25m\" (UniqueName: \"kubernetes.io/projected/5acf067e-b50e-4176-8d97-18188382659a-kube-api-access-xt25m\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465296 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-frr-conf\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465317 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465341 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-frr-sockets\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465358 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-metrics\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465373 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465390 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mgrp\" (UniqueName: \"kubernetes.io/projected/305963d0-7d19-440d-ba24-c836947123ab-kube-api-access-6mgrp\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465405 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5acf067e-b50e-4176-8d97-18188382659a-metallb-excludel2\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465418 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-reloader\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465687 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-reloader\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: E0121 09:14:36.465755 4618 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.466012 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/305963d0-7d19-440d-ba24-c836947123ab-frr-startup\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: E0121 09:14:36.466037 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist podName:5acf067e-b50e-4176-8d97-18188382659a nodeName:}" failed. No retries permitted until 2026-01-21 09:14:36.966022326 +0000 UTC m=+675.716489643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist") pod "speaker-bxvc2" (UID: "5acf067e-b50e-4176-8d97-18188382659a") : secret "metallb-memberlist" not found
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.465929 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-frr-sockets\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.466053 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-metrics\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: E0121 09:14:36.465918 4618 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Jan 21 09:14:36 crc kubenswrapper[4618]: E0121 09:14:36.466107 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-cert podName:0b1f4460-bb9d-4f03-a4bd-57e0a5f79669 nodeName:}" failed. No retries permitted until 2026-01-21 09:14:36.966086146 +0000 UTC m=+675.716553464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-cert") pod "frr-k8s-webhook-server-7df86c4f6c-2l8f6" (UID: "0b1f4460-bb9d-4f03-a4bd-57e0a5f79669") : secret "frr-k8s-webhook-server-cert" not found
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.466193 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/305963d0-7d19-440d-ba24-c836947123ab-frr-conf\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.466366 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5acf067e-b50e-4176-8d97-18188382659a-metallb-excludel2\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.470662 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-metrics-certs\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.470679 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/305963d0-7d19-440d-ba24-c836947123ab-metrics-certs\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn"
Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.478862 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-k5scb\" (UniqueName: \"kubernetes.io/projected/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-kube-api-access-k5scb\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.479420 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mgrp\" (UniqueName: \"kubernetes.io/projected/305963d0-7d19-440d-ba24-c836947123ab-kube-api-access-6mgrp\") pod \"frr-k8s-fljjn\" (UID: \"305963d0-7d19-440d-ba24-c836947123ab\") " pod="metallb-system/frr-k8s-fljjn" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.487065 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt25m\" (UniqueName: \"kubernetes.io/projected/5acf067e-b50e-4176-8d97-18188382659a-kube-api-access-xt25m\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.566927 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzm4p\" (UniqueName: \"kubernetes.io/projected/3754650d-5a51-4b01-98e7-2575b5212346-kube-api-access-xzm4p\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.566974 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3754650d-5a51-4b01-98e7-2575b5212346-metrics-certs\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.567085 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3754650d-5a51-4b01-98e7-2575b5212346-cert\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.636124 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fljjn" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.668165 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3754650d-5a51-4b01-98e7-2575b5212346-cert\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.668237 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzm4p\" (UniqueName: \"kubernetes.io/projected/3754650d-5a51-4b01-98e7-2575b5212346-kube-api-access-xzm4p\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.668267 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3754650d-5a51-4b01-98e7-2575b5212346-metrics-certs\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.669864 4618 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.670945 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3754650d-5a51-4b01-98e7-2575b5212346-metrics-certs\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.680815 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzm4p\" (UniqueName: \"kubernetes.io/projected/3754650d-5a51-4b01-98e7-2575b5212346-kube-api-access-xzm4p\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.680945 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3754650d-5a51-4b01-98e7-2575b5212346-cert\") pod \"controller-6968d8fdc4-gn7q5\" (UID: \"3754650d-5a51-4b01-98e7-2575b5212346\") " pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.711094 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.970645 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.970887 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2" Jan 21 09:14:36 crc kubenswrapper[4618]: E0121 09:14:36.971067 4618 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 21 09:14:36 crc kubenswrapper[4618]: E0121 09:14:36.971133 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist podName:5acf067e-b50e-4176-8d97-18188382659a nodeName:}" failed. No retries permitted until 2026-01-21 09:14:37.971118678 +0000 UTC m=+676.721585996 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist") pod "speaker-bxvc2" (UID: "5acf067e-b50e-4176-8d97-18188382659a") : secret "metallb-memberlist" not found Jan 21 09:14:36 crc kubenswrapper[4618]: I0121 09:14:36.973734 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b1f4460-bb9d-4f03-a4bd-57e0a5f79669-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2l8f6\" (UID: \"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.048360 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-gn7q5"] Jan 21 09:14:37 crc kubenswrapper[4618]: W0121 09:14:37.052040 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3754650d_5a51_4b01_98e7_2575b5212346.slice/crio-36655c7dcab8703364d8ee21e9b4dcd1f3aded793fbd2f0c6184710e3a0840ab WatchSource:0}: Error finding container 36655c7dcab8703364d8ee21e9b4dcd1f3aded793fbd2f0c6184710e3a0840ab: Status 404 returned error can't find the container with id 36655c7dcab8703364d8ee21e9b4dcd1f3aded793fbd2f0c6184710e3a0840ab Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.229360 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.451280 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-gn7q5" event={"ID":"3754650d-5a51-4b01-98e7-2575b5212346","Type":"ContainerStarted","Data":"5d4d73b07023a91874977df0a290df58b50203aa33f4485b3901919a6101d74e"} Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.451318 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-gn7q5" event={"ID":"3754650d-5a51-4b01-98e7-2575b5212346","Type":"ContainerStarted","Data":"d8c3785f848a668275cb323a0851bec46a6919f00f8ec1fe1039a19cc066c65a"} Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.451330 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-gn7q5" event={"ID":"3754650d-5a51-4b01-98e7-2575b5212346","Type":"ContainerStarted","Data":"36655c7dcab8703364d8ee21e9b4dcd1f3aded793fbd2f0c6184710e3a0840ab"} Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.451360 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.452343 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"3f35c46cdfb05f52ccec23072080b5c7ff440488816f71b76b49ad78587c2db9"} Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.467512 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-gn7q5" podStartSLOduration=1.467499756 podStartE2EDuration="1.467499756s" podCreationTimestamp="2026-01-21 09:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:14:37.465251461 +0000 UTC 
m=+676.215718778" watchObservedRunningTime="2026-01-21 09:14:37.467499756 +0000 UTC m=+676.217967073" Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.559601 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6"] Jan 21 09:14:37 crc kubenswrapper[4618]: W0121 09:14:37.563040 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b1f4460_bb9d_4f03_a4bd_57e0a5f79669.slice/crio-48b3da5d928ac15b718fded562c8b0b10c0e41825c22a5d3179e75359154cb31 WatchSource:0}: Error finding container 48b3da5d928ac15b718fded562c8b0b10c0e41825c22a5d3179e75359154cb31: Status 404 returned error can't find the container with id 48b3da5d928ac15b718fded562c8b0b10c0e41825c22a5d3179e75359154cb31 Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.981171 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2" Jan 21 09:14:37 crc kubenswrapper[4618]: I0121 09:14:37.985070 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5acf067e-b50e-4176-8d97-18188382659a-memberlist\") pod \"speaker-bxvc2\" (UID: \"5acf067e-b50e-4176-8d97-18188382659a\") " pod="metallb-system/speaker-bxvc2" Jan 21 09:14:38 crc kubenswrapper[4618]: I0121 09:14:38.172151 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bxvc2" Jan 21 09:14:38 crc kubenswrapper[4618]: W0121 09:14:38.189355 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5acf067e_b50e_4176_8d97_18188382659a.slice/crio-00481abed659285745738f46edd01fd32c76875c6eef56afb11d21a569d3c440 WatchSource:0}: Error finding container 00481abed659285745738f46edd01fd32c76875c6eef56afb11d21a569d3c440: Status 404 returned error can't find the container with id 00481abed659285745738f46edd01fd32c76875c6eef56afb11d21a569d3c440 Jan 21 09:14:38 crc kubenswrapper[4618]: I0121 09:14:38.457310 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bxvc2" event={"ID":"5acf067e-b50e-4176-8d97-18188382659a","Type":"ContainerStarted","Data":"80de201985ba54b736951cd4595a68d24ca12bfd86db4a6f1010dacfae51a812"} Jan 21 09:14:38 crc kubenswrapper[4618]: I0121 09:14:38.457345 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bxvc2" event={"ID":"5acf067e-b50e-4176-8d97-18188382659a","Type":"ContainerStarted","Data":"00481abed659285745738f46edd01fd32c76875c6eef56afb11d21a569d3c440"} Jan 21 09:14:38 crc kubenswrapper[4618]: I0121 09:14:38.458907 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" event={"ID":"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669","Type":"ContainerStarted","Data":"48b3da5d928ac15b718fded562c8b0b10c0e41825c22a5d3179e75359154cb31"} Jan 21 09:14:39 crc kubenswrapper[4618]: I0121 09:14:39.476769 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bxvc2" event={"ID":"5acf067e-b50e-4176-8d97-18188382659a","Type":"ContainerStarted","Data":"996e7f01535584c3662cb546183f7158d396b270fe82bc98ff6efdacae9357c0"} Jan 21 09:14:39 crc kubenswrapper[4618]: I0121 09:14:39.476923 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-bxvc2" Jan 21 09:14:39 crc kubenswrapper[4618]: I0121 09:14:39.490052 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bxvc2" podStartSLOduration=3.4900375390000002 podStartE2EDuration="3.490037539s" podCreationTimestamp="2026-01-21 09:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:14:39.487680921 +0000 UTC m=+678.238148239" watchObservedRunningTime="2026-01-21 09:14:39.490037539 +0000 UTC m=+678.240504857" Jan 21 09:14:43 crc kubenswrapper[4618]: I0121 09:14:43.509734 4618 generic.go:334] "Generic (PLEG): container finished" podID="305963d0-7d19-440d-ba24-c836947123ab" containerID="d23a51d4db79311ff94eeb0b0a0a0401cfa851c0220c7eb586ed1a2c16eea1c1" exitCode=0 Jan 21 09:14:43 crc kubenswrapper[4618]: I0121 09:14:43.509940 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerDied","Data":"d23a51d4db79311ff94eeb0b0a0a0401cfa851c0220c7eb586ed1a2c16eea1c1"} Jan 21 09:14:43 crc kubenswrapper[4618]: I0121 09:14:43.514260 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" event={"ID":"0b1f4460-bb9d-4f03-a4bd-57e0a5f79669","Type":"ContainerStarted","Data":"cf21e494be6d09277967c6d9e7e3d1975d9613b500189a2c8cef91f6198412b7"} Jan 21 09:14:43 crc kubenswrapper[4618]: I0121 09:14:43.514389 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" Jan 21 09:14:43 crc kubenswrapper[4618]: I0121 09:14:43.546216 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" podStartSLOduration=2.713620843 podStartE2EDuration="7.546200267s" podCreationTimestamp="2026-01-21 09:14:36 +0000 UTC" 
firstStartedPulling="2026-01-21 09:14:37.564818224 +0000 UTC m=+676.315285541" lastFinishedPulling="2026-01-21 09:14:42.397397649 +0000 UTC m=+681.147864965" observedRunningTime="2026-01-21 09:14:43.544228502 +0000 UTC m=+682.294695819" watchObservedRunningTime="2026-01-21 09:14:43.546200267 +0000 UTC m=+682.296667585" Jan 21 09:14:44 crc kubenswrapper[4618]: I0121 09:14:44.519312 4618 generic.go:334] "Generic (PLEG): container finished" podID="305963d0-7d19-440d-ba24-c836947123ab" containerID="9a3b05017fd75227fadf5c5aacf404d6a3987b0f7ef433a85fad8c37f9214b60" exitCode=0 Jan 21 09:14:44 crc kubenswrapper[4618]: I0121 09:14:44.519396 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerDied","Data":"9a3b05017fd75227fadf5c5aacf404d6a3987b0f7ef433a85fad8c37f9214b60"} Jan 21 09:14:45 crc kubenswrapper[4618]: I0121 09:14:45.525297 4618 generic.go:334] "Generic (PLEG): container finished" podID="305963d0-7d19-440d-ba24-c836947123ab" containerID="ad33905e49bf740fda613345ef56586eb09270960e2186c59c7650385cc68df3" exitCode=0 Jan 21 09:14:45 crc kubenswrapper[4618]: I0121 09:14:45.525493 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerDied","Data":"ad33905e49bf740fda613345ef56586eb09270960e2186c59c7650385cc68df3"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534323 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"ff9ac8c440c3b647fd6a6ccd8b96eccc4de28b8a1c3ebb8975ace28d17ee8ebc"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534543 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fljjn" Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534555 4618 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"b95698d57d0882e2e967c3d9887bcadbc72cbce027e102310abd24e69e884442"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534565 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"04d520fe51872cc3f9f4b39cdc5173067127ea07e673e68babed9633a85ef63d"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534572 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"08d3b77a8508490e002de112291b84b749db7ed595ac5708a3ad38f68b3be90e"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534579 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"1478f5a79552c501686bf7a35790d29049ce1fc02783f7ee9f13795e6caa8825"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.534586 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fljjn" event={"ID":"305963d0-7d19-440d-ba24-c836947123ab","Type":"ContainerStarted","Data":"49d4fc5caa79de0526082a2f569903f56ead28bd30d4399994f64a4fc59e3313"} Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.550179 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fljjn" podStartSLOduration=4.887885257 podStartE2EDuration="10.55016753s" podCreationTimestamp="2026-01-21 09:14:36 +0000 UTC" firstStartedPulling="2026-01-21 09:14:36.729577971 +0000 UTC m=+675.480045288" lastFinishedPulling="2026-01-21 09:14:42.391860245 +0000 UTC m=+681.142327561" observedRunningTime="2026-01-21 09:14:46.54838481 +0000 UTC m=+685.298852127" 
watchObservedRunningTime="2026-01-21 09:14:46.55016753 +0000 UTC m=+685.300634847" Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.636269 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fljjn" Jan 21 09:14:46 crc kubenswrapper[4618]: I0121 09:14:46.663971 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fljjn" Jan 21 09:14:48 crc kubenswrapper[4618]: I0121 09:14:48.174376 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bxvc2" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.151427 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wm8pg"] Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.152027 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.153300 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-klvqr" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.153355 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.153621 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.161690 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wm8pg"] Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.238742 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxr2\" (UniqueName: \"kubernetes.io/projected/b89efe00-726c-434f-9532-ce375de73e5f-kube-api-access-ctxr2\") pod 
\"openstack-operator-index-wm8pg\" (UID: \"b89efe00-726c-434f-9532-ce375de73e5f\") " pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.340023 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxr2\" (UniqueName: \"kubernetes.io/projected/b89efe00-726c-434f-9532-ce375de73e5f-kube-api-access-ctxr2\") pod \"openstack-operator-index-wm8pg\" (UID: \"b89efe00-726c-434f-9532-ce375de73e5f\") " pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.354243 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxr2\" (UniqueName: \"kubernetes.io/projected/b89efe00-726c-434f-9532-ce375de73e5f-kube-api-access-ctxr2\") pod \"openstack-operator-index-wm8pg\" (UID: \"b89efe00-726c-434f-9532-ce375de73e5f\") " pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.465496 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:50 crc kubenswrapper[4618]: I0121 09:14:50.794481 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wm8pg"] Jan 21 09:14:51 crc kubenswrapper[4618]: I0121 09:14:51.564792 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wm8pg" event={"ID":"b89efe00-726c-434f-9532-ce375de73e5f","Type":"ContainerStarted","Data":"4bb23242e2787db4272b35f8a41d8650a72367d0a72004d00df58aa6f4008ebc"} Jan 21 09:14:52 crc kubenswrapper[4618]: I0121 09:14:52.570117 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wm8pg" event={"ID":"b89efe00-726c-434f-9532-ce375de73e5f","Type":"ContainerStarted","Data":"71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb"} Jan 21 09:14:52 crc kubenswrapper[4618]: I0121 09:14:52.581862 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wm8pg" podStartSLOduration=1.489506323 podStartE2EDuration="2.581851669s" podCreationTimestamp="2026-01-21 09:14:50 +0000 UTC" firstStartedPulling="2026-01-21 09:14:50.798996634 +0000 UTC m=+689.549463952" lastFinishedPulling="2026-01-21 09:14:51.891341981 +0000 UTC m=+690.641809298" observedRunningTime="2026-01-21 09:14:52.579097164 +0000 UTC m=+691.329564481" watchObservedRunningTime="2026-01-21 09:14:52.581851669 +0000 UTC m=+691.332318986" Jan 21 09:14:53 crc kubenswrapper[4618]: I0121 09:14:53.542415 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wm8pg"] Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.142913 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6m77l"] Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.143493 4618 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.150824 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6m77l"] Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.182177 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwnz\" (UniqueName: \"kubernetes.io/projected/fa1a4914-7994-4004-b3aa-b3bbf62ed6df-kube-api-access-zhwnz\") pod \"openstack-operator-index-6m77l\" (UID: \"fa1a4914-7994-4004-b3aa-b3bbf62ed6df\") " pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.282876 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwnz\" (UniqueName: \"kubernetes.io/projected/fa1a4914-7994-4004-b3aa-b3bbf62ed6df-kube-api-access-zhwnz\") pod \"openstack-operator-index-6m77l\" (UID: \"fa1a4914-7994-4004-b3aa-b3bbf62ed6df\") " pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.297076 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhwnz\" (UniqueName: \"kubernetes.io/projected/fa1a4914-7994-4004-b3aa-b3bbf62ed6df-kube-api-access-zhwnz\") pod \"openstack-operator-index-6m77l\" (UID: \"fa1a4914-7994-4004-b3aa-b3bbf62ed6df\") " pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.454508 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.578414 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wm8pg" podUID="b89efe00-726c-434f-9532-ce375de73e5f" containerName="registry-server" containerID="cri-o://71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb" gracePeriod=2 Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.786187 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6m77l"] Jan 21 09:14:54 crc kubenswrapper[4618]: W0121 09:14:54.787895 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1a4914_7994_4004_b3aa_b3bbf62ed6df.slice/crio-b55efb379067aa7b3288a5c339f7eb7a5160448c52e68390570fd79c928a2145 WatchSource:0}: Error finding container b55efb379067aa7b3288a5c339f7eb7a5160448c52e68390570fd79c928a2145: Status 404 returned error can't find the container with id b55efb379067aa7b3288a5c339f7eb7a5160448c52e68390570fd79c928a2145 Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.834662 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.889745 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctxr2\" (UniqueName: \"kubernetes.io/projected/b89efe00-726c-434f-9532-ce375de73e5f-kube-api-access-ctxr2\") pod \"b89efe00-726c-434f-9532-ce375de73e5f\" (UID: \"b89efe00-726c-434f-9532-ce375de73e5f\") " Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.893409 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89efe00-726c-434f-9532-ce375de73e5f-kube-api-access-ctxr2" (OuterVolumeSpecName: "kube-api-access-ctxr2") pod "b89efe00-726c-434f-9532-ce375de73e5f" (UID: "b89efe00-726c-434f-9532-ce375de73e5f"). InnerVolumeSpecName "kube-api-access-ctxr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:14:54 crc kubenswrapper[4618]: I0121 09:14:54.991233 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctxr2\" (UniqueName: \"kubernetes.io/projected/b89efe00-726c-434f-9532-ce375de73e5f-kube-api-access-ctxr2\") on node \"crc\" DevicePath \"\"" Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.582393 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6m77l" event={"ID":"fa1a4914-7994-4004-b3aa-b3bbf62ed6df","Type":"ContainerStarted","Data":"b55efb379067aa7b3288a5c339f7eb7a5160448c52e68390570fd79c928a2145"} Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.583670 4618 generic.go:334] "Generic (PLEG): container finished" podID="b89efe00-726c-434f-9532-ce375de73e5f" containerID="71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb" exitCode=0 Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.583704 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wm8pg" 
event={"ID":"b89efe00-726c-434f-9532-ce375de73e5f","Type":"ContainerDied","Data":"71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb"} Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.583715 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wm8pg" Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.583739 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wm8pg" event={"ID":"b89efe00-726c-434f-9532-ce375de73e5f","Type":"ContainerDied","Data":"4bb23242e2787db4272b35f8a41d8650a72367d0a72004d00df58aa6f4008ebc"} Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.583757 4618 scope.go:117] "RemoveContainer" containerID="71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb" Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.595481 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wm8pg"] Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.598115 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wm8pg"] Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.598913 4618 scope.go:117] "RemoveContainer" containerID="71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb" Jan 21 09:14:55 crc kubenswrapper[4618]: E0121 09:14:55.599236 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb\": container with ID starting with 71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb not found: ID does not exist" containerID="71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb" Jan 21 09:14:55 crc kubenswrapper[4618]: I0121 09:14:55.599262 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb"} err="failed to get container status \"71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb\": rpc error: code = NotFound desc = could not find container \"71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb\": container with ID starting with 71ba1d96f5ffdd983315cfa5cee6ef64a7d9fef5a171e16e14ac99571dc7d4fb not found: ID does not exist" Jan 21 09:14:56 crc kubenswrapper[4618]: I0121 09:14:56.589891 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6m77l" event={"ID":"fa1a4914-7994-4004-b3aa-b3bbf62ed6df","Type":"ContainerStarted","Data":"b4a193e7509429d34e455353fd6b96eec9ab14f85d0a64114d5cb121ba269321"} Jan 21 09:14:56 crc kubenswrapper[4618]: I0121 09:14:56.601119 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6m77l" podStartSLOduration=1.804390903 podStartE2EDuration="2.601106099s" podCreationTimestamp="2026-01-21 09:14:54 +0000 UTC" firstStartedPulling="2026-01-21 09:14:54.791757131 +0000 UTC m=+693.542224448" lastFinishedPulling="2026-01-21 09:14:55.588472327 +0000 UTC m=+694.338939644" observedRunningTime="2026-01-21 09:14:56.59812137 +0000 UTC m=+695.348588687" watchObservedRunningTime="2026-01-21 09:14:56.601106099 +0000 UTC m=+695.351573416" Jan 21 09:14:56 crc kubenswrapper[4618]: I0121 09:14:56.639046 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fljjn" Jan 21 09:14:56 crc kubenswrapper[4618]: I0121 09:14:56.714821 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-gn7q5" Jan 21 09:14:57 crc kubenswrapper[4618]: I0121 09:14:57.234993 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2l8f6" Jan 21 09:14:57 crc 
kubenswrapper[4618]: I0121 09:14:57.543398 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89efe00-726c-434f-9532-ce375de73e5f" path="/var/lib/kubelet/pods/b89efe00-726c-434f-9532-ce375de73e5f/volumes" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.132396 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl"] Jan 21 09:15:00 crc kubenswrapper[4618]: E0121 09:15:00.132593 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89efe00-726c-434f-9532-ce375de73e5f" containerName="registry-server" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.132605 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89efe00-726c-434f-9532-ce375de73e5f" containerName="registry-server" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.132709 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89efe00-726c-434f-9532-ce375de73e5f" containerName="registry-server" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.133053 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.134205 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.134848 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.138175 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl"] Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.242390 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36179011-6684-4dd8-9436-d4185ce96397-config-volume\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.242676 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36179011-6684-4dd8-9436-d4185ce96397-secret-volume\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.242778 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjn4\" (UniqueName: \"kubernetes.io/projected/36179011-6684-4dd8-9436-d4185ce96397-kube-api-access-wsjn4\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.344081 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36179011-6684-4dd8-9436-d4185ce96397-config-volume\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.344328 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36179011-6684-4dd8-9436-d4185ce96397-secret-volume\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.344351 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjn4\" (UniqueName: \"kubernetes.io/projected/36179011-6684-4dd8-9436-d4185ce96397-kube-api-access-wsjn4\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.344781 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36179011-6684-4dd8-9436-d4185ce96397-config-volume\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.348466 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/36179011-6684-4dd8-9436-d4185ce96397-secret-volume\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.356005 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjn4\" (UniqueName: \"kubernetes.io/projected/36179011-6684-4dd8-9436-d4185ce96397-kube-api-access-wsjn4\") pod \"collect-profiles-29483115-lw6gl\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.446241 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:00 crc kubenswrapper[4618]: I0121 09:15:00.796855 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl"] Jan 21 09:15:00 crc kubenswrapper[4618]: W0121 09:15:00.799647 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36179011_6684_4dd8_9436_d4185ce96397.slice/crio-096ae8f2ac7ae7ce670f4a320427194b0e1db017daf974d7e76fbd72b6d24ff2 WatchSource:0}: Error finding container 096ae8f2ac7ae7ce670f4a320427194b0e1db017daf974d7e76fbd72b6d24ff2: Status 404 returned error can't find the container with id 096ae8f2ac7ae7ce670f4a320427194b0e1db017daf974d7e76fbd72b6d24ff2 Jan 21 09:15:01 crc kubenswrapper[4618]: I0121 09:15:01.610646 4618 generic.go:334] "Generic (PLEG): container finished" podID="36179011-6684-4dd8-9436-d4185ce96397" containerID="a0708cf98e07ec1dfbb2355dd82000acfef18763c3a1abdf9efa7a06d2161622" exitCode=0 Jan 21 09:15:01 crc kubenswrapper[4618]: I0121 09:15:01.610676 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" event={"ID":"36179011-6684-4dd8-9436-d4185ce96397","Type":"ContainerDied","Data":"a0708cf98e07ec1dfbb2355dd82000acfef18763c3a1abdf9efa7a06d2161622"} Jan 21 09:15:01 crc kubenswrapper[4618]: I0121 09:15:01.610695 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" event={"ID":"36179011-6684-4dd8-9436-d4185ce96397","Type":"ContainerStarted","Data":"096ae8f2ac7ae7ce670f4a320427194b0e1db017daf974d7e76fbd72b6d24ff2"} Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.794702 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.869353 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjn4\" (UniqueName: \"kubernetes.io/projected/36179011-6684-4dd8-9436-d4185ce96397-kube-api-access-wsjn4\") pod \"36179011-6684-4dd8-9436-d4185ce96397\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.869397 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36179011-6684-4dd8-9436-d4185ce96397-secret-volume\") pod \"36179011-6684-4dd8-9436-d4185ce96397\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.869445 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36179011-6684-4dd8-9436-d4185ce96397-config-volume\") pod \"36179011-6684-4dd8-9436-d4185ce96397\" (UID: \"36179011-6684-4dd8-9436-d4185ce96397\") " Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.870135 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/36179011-6684-4dd8-9436-d4185ce96397-config-volume" (OuterVolumeSpecName: "config-volume") pod "36179011-6684-4dd8-9436-d4185ce96397" (UID: "36179011-6684-4dd8-9436-d4185ce96397"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.873475 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36179011-6684-4dd8-9436-d4185ce96397-kube-api-access-wsjn4" (OuterVolumeSpecName: "kube-api-access-wsjn4") pod "36179011-6684-4dd8-9436-d4185ce96397" (UID: "36179011-6684-4dd8-9436-d4185ce96397"). InnerVolumeSpecName "kube-api-access-wsjn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.873581 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36179011-6684-4dd8-9436-d4185ce96397-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36179011-6684-4dd8-9436-d4185ce96397" (UID: "36179011-6684-4dd8-9436-d4185ce96397"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.970241 4618 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36179011-6684-4dd8-9436-d4185ce96397-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.970371 4618 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36179011-6684-4dd8-9436-d4185ce96397-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:15:02 crc kubenswrapper[4618]: I0121 09:15:02.970434 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjn4\" (UniqueName: \"kubernetes.io/projected/36179011-6684-4dd8-9436-d4185ce96397-kube-api-access-wsjn4\") on node \"crc\" DevicePath \"\"" Jan 21 09:15:03 crc kubenswrapper[4618]: I0121 09:15:03.622438 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" event={"ID":"36179011-6684-4dd8-9436-d4185ce96397","Type":"ContainerDied","Data":"096ae8f2ac7ae7ce670f4a320427194b0e1db017daf974d7e76fbd72b6d24ff2"} Jan 21 09:15:03 crc kubenswrapper[4618]: I0121 09:15:03.622634 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="096ae8f2ac7ae7ce670f4a320427194b0e1db017daf974d7e76fbd72b6d24ff2" Jan 21 09:15:03 crc kubenswrapper[4618]: I0121 09:15:03.622473 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.454878 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.454917 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.473960 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.643362 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6m77l" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.963258 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww"] Jan 21 09:15:04 crc kubenswrapper[4618]: E0121 09:15:04.963455 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36179011-6684-4dd8-9436-d4185ce96397" containerName="collect-profiles" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.963466 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="36179011-6684-4dd8-9436-d4185ce96397" containerName="collect-profiles" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.963569 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="36179011-6684-4dd8-9436-d4185ce96397" containerName="collect-profiles" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.964263 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.965677 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4xnkw" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.973422 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww"] Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.991016 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.991108 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzhwr\" (UniqueName: \"kubernetes.io/projected/1d0a2799-66d1-4406-b3da-33db634ae051-kube-api-access-rzhwr\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:04 crc kubenswrapper[4618]: I0121 09:15:04.991274 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 
09:15:05.092654 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzhwr\" (UniqueName: \"kubernetes.io/projected/1d0a2799-66d1-4406-b3da-33db634ae051-kube-api-access-rzhwr\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.092690 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.092736 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.093087 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.093369 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.106575 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzhwr\" (UniqueName: \"kubernetes.io/projected/1d0a2799-66d1-4406-b3da-33db634ae051-kube-api-access-rzhwr\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.275317 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.604136 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww"] Jan 21 09:15:05 crc kubenswrapper[4618]: W0121 09:15:05.607986 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0a2799_66d1_4406_b3da_33db634ae051.slice/crio-ac459fe63d69b85e1f7e715a351f4096cd117544543842bbc7983225968e65ad WatchSource:0}: Error finding container ac459fe63d69b85e1f7e715a351f4096cd117544543842bbc7983225968e65ad: Status 404 returned error can't find the container with id ac459fe63d69b85e1f7e715a351f4096cd117544543842bbc7983225968e65ad Jan 21 09:15:05 crc kubenswrapper[4618]: I0121 09:15:05.631784 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" 
event={"ID":"1d0a2799-66d1-4406-b3da-33db634ae051","Type":"ContainerStarted","Data":"ac459fe63d69b85e1f7e715a351f4096cd117544543842bbc7983225968e65ad"} Jan 21 09:15:06 crc kubenswrapper[4618]: I0121 09:15:06.637168 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d0a2799-66d1-4406-b3da-33db634ae051" containerID="f27a10a997bca1d1bfe106676d0851f46099514f732b5e04a097e2854752c27d" exitCode=0 Jan 21 09:15:06 crc kubenswrapper[4618]: I0121 09:15:06.637203 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" event={"ID":"1d0a2799-66d1-4406-b3da-33db634ae051","Type":"ContainerDied","Data":"f27a10a997bca1d1bfe106676d0851f46099514f732b5e04a097e2854752c27d"} Jan 21 09:15:07 crc kubenswrapper[4618]: I0121 09:15:07.642996 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d0a2799-66d1-4406-b3da-33db634ae051" containerID="b280f412fc44ec640d8d4e05ee08b26de8f979cd5f76b0a0386358cc9aeda19b" exitCode=0 Jan 21 09:15:07 crc kubenswrapper[4618]: I0121 09:15:07.643026 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" event={"ID":"1d0a2799-66d1-4406-b3da-33db634ae051","Type":"ContainerDied","Data":"b280f412fc44ec640d8d4e05ee08b26de8f979cd5f76b0a0386358cc9aeda19b"} Jan 21 09:15:08 crc kubenswrapper[4618]: I0121 09:15:08.649730 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d0a2799-66d1-4406-b3da-33db634ae051" containerID="d24a3d4113fbf39b1d09abc00b5c36b5a6f88a96e16cbdac75d36d045489cbf2" exitCode=0 Jan 21 09:15:08 crc kubenswrapper[4618]: I0121 09:15:08.649768 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" event={"ID":"1d0a2799-66d1-4406-b3da-33db634ae051","Type":"ContainerDied","Data":"d24a3d4113fbf39b1d09abc00b5c36b5a6f88a96e16cbdac75d36d045489cbf2"} Jan 21 
09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.830335 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.935749 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-util\") pod \"1d0a2799-66d1-4406-b3da-33db634ae051\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " Jan 21 09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.935874 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-bundle\") pod \"1d0a2799-66d1-4406-b3da-33db634ae051\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " Jan 21 09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.935896 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzhwr\" (UniqueName: \"kubernetes.io/projected/1d0a2799-66d1-4406-b3da-33db634ae051-kube-api-access-rzhwr\") pod \"1d0a2799-66d1-4406-b3da-33db634ae051\" (UID: \"1d0a2799-66d1-4406-b3da-33db634ae051\") " Jan 21 09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.936348 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-bundle" (OuterVolumeSpecName: "bundle") pod "1d0a2799-66d1-4406-b3da-33db634ae051" (UID: "1d0a2799-66d1-4406-b3da-33db634ae051"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.940365 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0a2799-66d1-4406-b3da-33db634ae051-kube-api-access-rzhwr" (OuterVolumeSpecName: "kube-api-access-rzhwr") pod "1d0a2799-66d1-4406-b3da-33db634ae051" (UID: "1d0a2799-66d1-4406-b3da-33db634ae051"). InnerVolumeSpecName "kube-api-access-rzhwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:15:09 crc kubenswrapper[4618]: I0121 09:15:09.946104 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-util" (OuterVolumeSpecName: "util") pod "1d0a2799-66d1-4406-b3da-33db634ae051" (UID: "1d0a2799-66d1-4406-b3da-33db634ae051"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:15:10 crc kubenswrapper[4618]: I0121 09:15:10.036749 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzhwr\" (UniqueName: \"kubernetes.io/projected/1d0a2799-66d1-4406-b3da-33db634ae051-kube-api-access-rzhwr\") on node \"crc\" DevicePath \"\"" Jan 21 09:15:10 crc kubenswrapper[4618]: I0121 09:15:10.036774 4618 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-util\") on node \"crc\" DevicePath \"\"" Jan 21 09:15:10 crc kubenswrapper[4618]: I0121 09:15:10.036785 4618 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0a2799-66d1-4406-b3da-33db634ae051-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:15:10 crc kubenswrapper[4618]: I0121 09:15:10.658541 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" 
event={"ID":"1d0a2799-66d1-4406-b3da-33db634ae051","Type":"ContainerDied","Data":"ac459fe63d69b85e1f7e715a351f4096cd117544543842bbc7983225968e65ad"} Jan 21 09:15:10 crc kubenswrapper[4618]: I0121 09:15:10.658572 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac459fe63d69b85e1f7e715a351f4096cd117544543842bbc7983225968e65ad" Jan 21 09:15:10 crc kubenswrapper[4618]: I0121 09:15:10.658594 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.295908 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s"] Jan 21 09:15:17 crc kubenswrapper[4618]: E0121 09:15:17.296353 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="util" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.296365 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="util" Jan 21 09:15:17 crc kubenswrapper[4618]: E0121 09:15:17.296376 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="pull" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.296382 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="pull" Jan 21 09:15:17 crc kubenswrapper[4618]: E0121 09:15:17.296389 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="extract" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.296394 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="extract" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.296501 4618 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0a2799-66d1-4406-b3da-33db634ae051" containerName="extract" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.296862 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.298374 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-wwxlm" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.327651 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s"] Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.409951 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btjpv\" (UniqueName: \"kubernetes.io/projected/049e7414-823b-45cc-92e6-da0652157046-kube-api-access-btjpv\") pod \"openstack-operator-controller-init-6d4d7d8545-hbl4s\" (UID: \"049e7414-823b-45cc-92e6-da0652157046\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.511345 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btjpv\" (UniqueName: \"kubernetes.io/projected/049e7414-823b-45cc-92e6-da0652157046-kube-api-access-btjpv\") pod \"openstack-operator-controller-init-6d4d7d8545-hbl4s\" (UID: \"049e7414-823b-45cc-92e6-da0652157046\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.531082 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btjpv\" (UniqueName: \"kubernetes.io/projected/049e7414-823b-45cc-92e6-da0652157046-kube-api-access-btjpv\") pod 
\"openstack-operator-controller-init-6d4d7d8545-hbl4s\" (UID: \"049e7414-823b-45cc-92e6-da0652157046\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.609694 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:17 crc kubenswrapper[4618]: I0121 09:15:17.964347 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s"] Jan 21 09:15:18 crc kubenswrapper[4618]: I0121 09:15:18.691330 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" event={"ID":"049e7414-823b-45cc-92e6-da0652157046","Type":"ContainerStarted","Data":"4490484cef155292ef308a9d1b47cf2669d3a0d8a4b53f817057087114cd786b"} Jan 21 09:15:21 crc kubenswrapper[4618]: I0121 09:15:21.715917 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" event={"ID":"049e7414-823b-45cc-92e6-da0652157046","Type":"ContainerStarted","Data":"91f61aaf5de4c3ea801b79322e0894b26635b6313c300b0e686d482276a7965e"} Jan 21 09:15:21 crc kubenswrapper[4618]: I0121 09:15:21.716929 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:21 crc kubenswrapper[4618]: I0121 09:15:21.744519 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" podStartSLOduration=1.620297371 podStartE2EDuration="4.744503652s" podCreationTimestamp="2026-01-21 09:15:17 +0000 UTC" firstStartedPulling="2026-01-21 09:15:17.971874486 +0000 UTC m=+716.722341803" lastFinishedPulling="2026-01-21 09:15:21.096080767 +0000 UTC m=+719.846548084" 
observedRunningTime="2026-01-21 09:15:21.737469866 +0000 UTC m=+720.487937183" watchObservedRunningTime="2026-01-21 09:15:21.744503652 +0000 UTC m=+720.494970969" Jan 21 09:15:27 crc kubenswrapper[4618]: I0121 09:15:27.611963 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-hbl4s" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.451577 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.452641 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.458781 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.459730 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.466605 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-hzvql" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.466838 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jwnfq" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.471834 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.476577 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-nf54z"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.477406 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.482457 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-n7s9w" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.482618 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.491333 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.492097 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.493389 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-55zxr" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.499481 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-nf54z"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.524071 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.524997 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpx5j\" (UniqueName: \"kubernetes.io/projected/982d4204-447a-43c3-858e-c16cceebf1bb-kube-api-access-jpx5j\") pod \"barbican-operator-controller-manager-7ddb5c749-6j9f2\" (UID: \"982d4204-447a-43c3-858e-c16cceebf1bb\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.525126 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8dw6\" (UniqueName: \"kubernetes.io/projected/f3975776-d0c3-478c-873c-349415bf2d3c-kube-api-access-n8dw6\") pod \"designate-operator-controller-manager-9f958b845-nf54z\" (UID: \"f3975776-d0c3-478c-873c-349415bf2d3c\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.525308 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jxf\" (UniqueName: \"kubernetes.io/projected/e0011800-e28a-4e71-8306-819d8d865dfe-kube-api-access-72jxf\") pod 
\"glance-operator-controller-manager-c6994669c-4r2qm\" (UID: \"e0011800-e28a-4e71-8306-819d8d865dfe\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.525516 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4sh\" (UniqueName: \"kubernetes.io/projected/d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4-kube-api-access-fp4sh\") pod \"cinder-operator-controller-manager-9b68f5989-6zn64\" (UID: \"d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.527923 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.528608 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.530398 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ncmtg" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.547448 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.548100 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.549447 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.550317 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rlnlk" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.554493 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.562225 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.562949 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.564217 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-l7s4l" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.564886 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.569705 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.570451 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.575056 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fv46x" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.576021 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.580211 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.580897 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.583523 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4wwp6" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.589628 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.599602 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.600592 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.602340 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7cqsp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.607175 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.612595 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.613237 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.615657 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jm64r" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.618744 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.620917 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.625996 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7gqq\" (UniqueName: \"kubernetes.io/projected/80cee31f-467d-4c99-8b58-1edbee74f4a9-kube-api-access-h7gqq\") pod \"ironic-operator-controller-manager-78757b4889-g58rl\" (UID: \"80cee31f-467d-4c99-8b58-1edbee74f4a9\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" Jan 
21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626024 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66xms\" (UniqueName: \"kubernetes.io/projected/69396ad4-b4ad-4f43-a0f5-83b655e590da-kube-api-access-66xms\") pod \"keystone-operator-controller-manager-767fdc4f47-l55q5\" (UID: \"69396ad4-b4ad-4f43-a0f5-83b655e590da\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626045 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjq9t\" (UniqueName: \"kubernetes.io/projected/276f144f-a185-46da-a3af-f0aa8a9eaaad-kube-api-access-pjq9t\") pod \"heat-operator-controller-manager-594c8c9d5d-ms7zc\" (UID: \"276f144f-a185-46da-a3af-f0aa8a9eaaad\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626080 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpx5j\" (UniqueName: \"kubernetes.io/projected/982d4204-447a-43c3-858e-c16cceebf1bb-kube-api-access-jpx5j\") pod \"barbican-operator-controller-manager-7ddb5c749-6j9f2\" (UID: \"982d4204-447a-43c3-858e-c16cceebf1bb\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626100 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptmf\" (UniqueName: \"kubernetes.io/projected/61c3771f-ea2c-4307-8d5b-7f44194235cd-kube-api-access-nptmf\") pod \"manila-operator-controller-manager-864f6b75bf-djc75\" (UID: \"61c3771f-ea2c-4307-8d5b-7f44194235cd\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626117 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n8dw6\" (UniqueName: \"kubernetes.io/projected/f3975776-d0c3-478c-873c-349415bf2d3c-kube-api-access-n8dw6\") pod \"designate-operator-controller-manager-9f958b845-nf54z\" (UID: \"f3975776-d0c3-478c-873c-349415bf2d3c\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626132 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626160 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frrqz\" (UniqueName: \"kubernetes.io/projected/cad4873a-5a2e-40ea-a4b1-3173e8138be0-kube-api-access-frrqz\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626181 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvpl\" (UniqueName: \"kubernetes.io/projected/0ff11d9c-92c7-4b78-8336-70e117f63880-kube-api-access-2tvpl\") pod \"horizon-operator-controller-manager-77d5c5b54f-bd65l\" (UID: \"0ff11d9c-92c7-4b78-8336-70e117f63880\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626197 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jxf\" (UniqueName: 
\"kubernetes.io/projected/e0011800-e28a-4e71-8306-819d8d865dfe-kube-api-access-72jxf\") pod \"glance-operator-controller-manager-c6994669c-4r2qm\" (UID: \"e0011800-e28a-4e71-8306-819d8d865dfe\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.626221 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4sh\" (UniqueName: \"kubernetes.io/projected/d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4-kube-api-access-fp4sh\") pod \"cinder-operator-controller-manager-9b68f5989-6zn64\" (UID: \"d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.627563 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.628352 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.630417 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-q2v85" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.638495 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.642449 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4sh\" (UniqueName: \"kubernetes.io/projected/d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4-kube-api-access-fp4sh\") pod \"cinder-operator-controller-manager-9b68f5989-6zn64\" (UID: \"d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.644711 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpx5j\" (UniqueName: \"kubernetes.io/projected/982d4204-447a-43c3-858e-c16cceebf1bb-kube-api-access-jpx5j\") pod \"barbican-operator-controller-manager-7ddb5c749-6j9f2\" (UID: \"982d4204-447a-43c3-858e-c16cceebf1bb\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.646293 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.646847 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.650192 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rlzl6" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.650251 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8dw6\" (UniqueName: \"kubernetes.io/projected/f3975776-d0c3-478c-873c-349415bf2d3c-kube-api-access-n8dw6\") pod \"designate-operator-controller-manager-9f958b845-nf54z\" (UID: \"f3975776-d0c3-478c-873c-349415bf2d3c\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.654449 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jxf\" (UniqueName: \"kubernetes.io/projected/e0011800-e28a-4e71-8306-819d8d865dfe-kube-api-access-72jxf\") pod \"glance-operator-controller-manager-c6994669c-4r2qm\" (UID: \"e0011800-e28a-4e71-8306-819d8d865dfe\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.661378 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.662071 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.663216 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qgczp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.671107 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.674705 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.681233 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.682476 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.684022 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-4jg58" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.686689 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.687661 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.688967 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.689111 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f76tn" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.705361 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.716347 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727034 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/14908c8c-b444-4359-9e3a-e0fcc443e9f7-kube-api-access-r68qr\") pod \"nova-operator-controller-manager-65849867d6-j5xjz\" (UID: \"14908c8c-b444-4359-9e3a-e0fcc443e9f7\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727082 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nptmf\" (UniqueName: \"kubernetes.io/projected/61c3771f-ea2c-4307-8d5b-7f44194235cd-kube-api-access-nptmf\") pod \"manila-operator-controller-manager-864f6b75bf-djc75\" (UID: \"61c3771f-ea2c-4307-8d5b-7f44194235cd\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727104 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhgcg\" (UniqueName: \"kubernetes.io/projected/0ec13d1d-fae7-4efd-92d6-0b93f972694f-kube-api-access-rhgcg\") pod \"mariadb-operator-controller-manager-c87fff755-lsgpp\" (UID: \"0ec13d1d-fae7-4efd-92d6-0b93f972694f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727131 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjkvw\" (UniqueName: \"kubernetes.io/projected/b662a5ae-39f6-4592-baf2-efa15f7c82b0-kube-api-access-fjkvw\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727161 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ntp\" (UniqueName: \"kubernetes.io/projected/f0bde946-f6c9-45a5-a124-6cf62551f0bc-kube-api-access-w7ntp\") pod \"neutron-operator-controller-manager-cb4666565-5m9wn\" (UID: \"f0bde946-f6c9-45a5-a124-6cf62551f0bc\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727178 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727195 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frrqz\" (UniqueName: 
\"kubernetes.io/projected/cad4873a-5a2e-40ea-a4b1-3173e8138be0-kube-api-access-frrqz\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.727377 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-r895x"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728171 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728465 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvpl\" (UniqueName: \"kubernetes.io/projected/0ff11d9c-92c7-4b78-8336-70e117f63880-kube-api-access-2tvpl\") pod \"horizon-operator-controller-manager-77d5c5b54f-bd65l\" (UID: \"0ff11d9c-92c7-4b78-8336-70e117f63880\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" Jan 21 09:15:45 crc kubenswrapper[4618]: E0121 09:15:45.728582 4618 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:45 crc kubenswrapper[4618]: E0121 09:15:45.728712 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert podName:cad4873a-5a2e-40ea-a4b1-3173e8138be0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:46.22869806 +0000 UTC m=+744.979165376 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert") pod "infra-operator-controller-manager-77c48c7859-dsjzx" (UID: "cad4873a-5a2e-40ea-a4b1-3173e8138be0") : secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728769 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728808 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqw74\" (UniqueName: \"kubernetes.io/projected/b3629416-c45e-46da-98ba-dfd8b6630abd-kube-api-access-kqw74\") pod \"ovn-operator-controller-manager-55db956ddc-7nkmc\" (UID: \"b3629416-c45e-46da-98ba-dfd8b6630abd\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728890 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7gqq\" (UniqueName: \"kubernetes.io/projected/80cee31f-467d-4c99-8b58-1edbee74f4a9-kube-api-access-h7gqq\") pod \"ironic-operator-controller-manager-78757b4889-g58rl\" (UID: \"80cee31f-467d-4c99-8b58-1edbee74f4a9\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728915 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwql\" (UniqueName: \"kubernetes.io/projected/1739988f-1de9-4c68-85ac-c14971105314-kube-api-access-btwql\") pod 
\"octavia-operator-controller-manager-7fc9b76cf6-cmhx4\" (UID: \"1739988f-1de9-4c68-85ac-c14971105314\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728938 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66xms\" (UniqueName: \"kubernetes.io/projected/69396ad4-b4ad-4f43-a0f5-83b655e590da-kube-api-access-66xms\") pod \"keystone-operator-controller-manager-767fdc4f47-l55q5\" (UID: \"69396ad4-b4ad-4f43-a0f5-83b655e590da\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.728979 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjq9t\" (UniqueName: \"kubernetes.io/projected/276f144f-a185-46da-a3af-f0aa8a9eaaad-kube-api-access-pjq9t\") pod \"heat-operator-controller-manager-594c8c9d5d-ms7zc\" (UID: \"276f144f-a185-46da-a3af-f0aa8a9eaaad\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.732051 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7qhlv" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.732423 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-r895x"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.746820 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptmf\" (UniqueName: \"kubernetes.io/projected/61c3771f-ea2c-4307-8d5b-7f44194235cd-kube-api-access-nptmf\") pod \"manila-operator-controller-manager-864f6b75bf-djc75\" (UID: \"61c3771f-ea2c-4307-8d5b-7f44194235cd\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" Jan 21 09:15:45 crc 
kubenswrapper[4618]: I0121 09:15:45.746919 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7gqq\" (UniqueName: \"kubernetes.io/projected/80cee31f-467d-4c99-8b58-1edbee74f4a9-kube-api-access-h7gqq\") pod \"ironic-operator-controller-manager-78757b4889-g58rl\" (UID: \"80cee31f-467d-4c99-8b58-1edbee74f4a9\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.756850 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjq9t\" (UniqueName: \"kubernetes.io/projected/276f144f-a185-46da-a3af-f0aa8a9eaaad-kube-api-access-pjq9t\") pod \"heat-operator-controller-manager-594c8c9d5d-ms7zc\" (UID: \"276f144f-a185-46da-a3af-f0aa8a9eaaad\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.758426 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvpl\" (UniqueName: \"kubernetes.io/projected/0ff11d9c-92c7-4b78-8336-70e117f63880-kube-api-access-2tvpl\") pod \"horizon-operator-controller-manager-77d5c5b54f-bd65l\" (UID: \"0ff11d9c-92c7-4b78-8336-70e117f63880\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.758878 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frrqz\" (UniqueName: \"kubernetes.io/projected/cad4873a-5a2e-40ea-a4b1-3173e8138be0-kube-api-access-frrqz\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.759357 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66xms\" (UniqueName: 
\"kubernetes.io/projected/69396ad4-b4ad-4f43-a0f5-83b655e590da-kube-api-access-66xms\") pod \"keystone-operator-controller-manager-767fdc4f47-l55q5\" (UID: \"69396ad4-b4ad-4f43-a0f5-83b655e590da\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.764223 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.765022 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.766515 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-85654" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.767611 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.771558 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.784182 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.798460 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.806859 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832645 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/14908c8c-b444-4359-9e3a-e0fcc443e9f7-kube-api-access-r68qr\") pod \"nova-operator-controller-manager-65849867d6-j5xjz\" (UID: \"14908c8c-b444-4359-9e3a-e0fcc443e9f7\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832702 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhgcg\" (UniqueName: \"kubernetes.io/projected/0ec13d1d-fae7-4efd-92d6-0b93f972694f-kube-api-access-rhgcg\") pod \"mariadb-operator-controller-manager-c87fff755-lsgpp\" (UID: \"0ec13d1d-fae7-4efd-92d6-0b93f972694f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832733 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjkvw\" (UniqueName: \"kubernetes.io/projected/b662a5ae-39f6-4592-baf2-efa15f7c82b0-kube-api-access-fjkvw\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832756 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ntp\" (UniqueName: \"kubernetes.io/projected/f0bde946-f6c9-45a5-a124-6cf62551f0bc-kube-api-access-w7ntp\") pod \"neutron-operator-controller-manager-cb4666565-5m9wn\" (UID: \"f0bde946-f6c9-45a5-a124-6cf62551f0bc\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" Jan 21 09:15:45 crc kubenswrapper[4618]: 
I0121 09:15:45.832803 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832825 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqw74\" (UniqueName: \"kubernetes.io/projected/b3629416-c45e-46da-98ba-dfd8b6630abd-kube-api-access-kqw74\") pod \"ovn-operator-controller-manager-55db956ddc-7nkmc\" (UID: \"b3629416-c45e-46da-98ba-dfd8b6630abd\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832859 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwql\" (UniqueName: \"kubernetes.io/projected/1739988f-1de9-4c68-85ac-c14971105314-kube-api-access-btwql\") pod \"octavia-operator-controller-manager-7fc9b76cf6-cmhx4\" (UID: \"1739988f-1de9-4c68-85ac-c14971105314\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832888 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrz4\" (UniqueName: \"kubernetes.io/projected/1f7120e5-8e39-4664-9d63-beaea1ff4043-kube-api-access-jtrz4\") pod \"placement-operator-controller-manager-686df47fcb-r895x\" (UID: \"1f7120e5-8e39-4664-9d63-beaea1ff4043\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.832910 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx9zm\" 
(UniqueName: \"kubernetes.io/projected/5af2019b-e469-403f-8c3e-91006f2902ad-kube-api-access-jx9zm\") pod \"swift-operator-controller-manager-85dd56d4cc-zgrxl\" (UID: \"5af2019b-e469-403f-8c3e-91006f2902ad\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" Jan 21 09:15:45 crc kubenswrapper[4618]: E0121 09:15:45.833690 4618 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:45 crc kubenswrapper[4618]: E0121 09:15:45.833731 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert podName:b662a5ae-39f6-4592-baf2-efa15f7c82b0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:46.333718965 +0000 UTC m=+745.084186282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" (UID: "b662a5ae-39f6-4592-baf2-efa15f7c82b0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.847017 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68qr\" (UniqueName: \"kubernetes.io/projected/14908c8c-b444-4359-9e3a-e0fcc443e9f7-kube-api-access-r68qr\") pod \"nova-operator-controller-manager-65849867d6-j5xjz\" (UID: \"14908c8c-b444-4359-9e3a-e0fcc443e9f7\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.853706 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhgcg\" (UniqueName: \"kubernetes.io/projected/0ec13d1d-fae7-4efd-92d6-0b93f972694f-kube-api-access-rhgcg\") pod \"mariadb-operator-controller-manager-c87fff755-lsgpp\" (UID: 
\"0ec13d1d-fae7-4efd-92d6-0b93f972694f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.857270 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwql\" (UniqueName: \"kubernetes.io/projected/1739988f-1de9-4c68-85ac-c14971105314-kube-api-access-btwql\") pod \"octavia-operator-controller-manager-7fc9b76cf6-cmhx4\" (UID: \"1739988f-1de9-4c68-85ac-c14971105314\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.858913 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjkvw\" (UniqueName: \"kubernetes.io/projected/b662a5ae-39f6-4592-baf2-efa15f7c82b0-kube-api-access-fjkvw\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.860666 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.862103 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqw74\" (UniqueName: \"kubernetes.io/projected/b3629416-c45e-46da-98ba-dfd8b6630abd-kube-api-access-kqw74\") pod \"ovn-operator-controller-manager-55db956ddc-7nkmc\" (UID: \"b3629416-c45e-46da-98ba-dfd8b6630abd\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.863674 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ntp\" (UniqueName: \"kubernetes.io/projected/f0bde946-f6c9-45a5-a124-6cf62551f0bc-kube-api-access-w7ntp\") pod \"neutron-operator-controller-manager-cb4666565-5m9wn\" (UID: \"f0bde946-f6c9-45a5-a124-6cf62551f0bc\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.864903 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.876409 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.883183 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.884363 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.886697 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wxxmc" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.898629 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.912914 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.922832 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.928130 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.934284 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtfhp\" (UniqueName: \"kubernetes.io/projected/16d3b481-106a-48ee-b99c-7a380086a9cd-kube-api-access-wtfhp\") pod \"telemetry-operator-controller-manager-5f8f495fcf-778qv\" (UID: \"16d3b481-106a-48ee-b99c-7a380086a9cd\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.934339 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrz4\" (UniqueName: \"kubernetes.io/projected/1f7120e5-8e39-4664-9d63-beaea1ff4043-kube-api-access-jtrz4\") pod \"placement-operator-controller-manager-686df47fcb-r895x\" (UID: \"1f7120e5-8e39-4664-9d63-beaea1ff4043\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.934361 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx9zm\" (UniqueName: \"kubernetes.io/projected/5af2019b-e469-403f-8c3e-91006f2902ad-kube-api-access-jx9zm\") pod \"swift-operator-controller-manager-85dd56d4cc-zgrxl\" (UID: \"5af2019b-e469-403f-8c3e-91006f2902ad\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.940353 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.950222 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrz4\" (UniqueName: \"kubernetes.io/projected/1f7120e5-8e39-4664-9d63-beaea1ff4043-kube-api-access-jtrz4\") pod \"placement-operator-controller-manager-686df47fcb-r895x\" (UID: \"1f7120e5-8e39-4664-9d63-beaea1ff4043\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.951010 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx9zm\" (UniqueName: \"kubernetes.io/projected/5af2019b-e469-403f-8c3e-91006f2902ad-kube-api-access-jx9zm\") pod \"swift-operator-controller-manager-85dd56d4cc-zgrxl\" (UID: \"5af2019b-e469-403f-8c3e-91006f2902ad\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.952221 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.954425 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.957378 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd"] Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.957562 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mnnj7" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.979595 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.991711 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" Jan 21 09:15:45 crc kubenswrapper[4618]: I0121 09:15:45.999531 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.035591 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf986\" (UniqueName: \"kubernetes.io/projected/e4f5bddf-5e04-4510-903b-6861f19fa87b-kube-api-access-sf986\") pod \"test-operator-controller-manager-7cd8bc9dbb-g4khd\" (UID: \"e4f5bddf-5e04-4510-903b-6861f19fa87b\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.035661 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtfhp\" (UniqueName: \"kubernetes.io/projected/16d3b481-106a-48ee-b99c-7a380086a9cd-kube-api-access-wtfhp\") pod \"telemetry-operator-controller-manager-5f8f495fcf-778qv\" (UID: \"16d3b481-106a-48ee-b99c-7a380086a9cd\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.047171 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.058548 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtfhp\" (UniqueName: \"kubernetes.io/projected/16d3b481-106a-48ee-b99c-7a380086a9cd-kube-api-access-wtfhp\") pod \"telemetry-operator-controller-manager-5f8f495fcf-778qv\" (UID: \"16d3b481-106a-48ee-b99c-7a380086a9cd\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.063488 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.064223 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.066624 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mcqwx" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.070876 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.100084 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.138424 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdckq\" (UniqueName: \"kubernetes.io/projected/010792a0-26fd-456a-9186-79799c9a511e-kube-api-access-fdckq\") pod \"watcher-operator-controller-manager-64cd966744-czzg6\" (UID: \"010792a0-26fd-456a-9186-79799c9a511e\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.138903 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf986\" (UniqueName: \"kubernetes.io/projected/e4f5bddf-5e04-4510-903b-6861f19fa87b-kube-api-access-sf986\") pod \"test-operator-controller-manager-7cd8bc9dbb-g4khd\" (UID: \"e4f5bddf-5e04-4510-903b-6861f19fa87b\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.163080 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.163781 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.166053 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.166236 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.167278 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf986\" (UniqueName: \"kubernetes.io/projected/e4f5bddf-5e04-4510-903b-6861f19fa87b-kube-api-access-sf986\") pod \"test-operator-controller-manager-7cd8bc9dbb-g4khd\" (UID: \"e4f5bddf-5e04-4510-903b-6861f19fa87b\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.167460 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-f28pd" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.171066 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.177362 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.205715 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.243763 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wbx\" (UniqueName: \"kubernetes.io/projected/cfa3b66e-c251-46f7-ade1-edd4df56db67-kube-api-access-r7wbx\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.243800 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.243829 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.243883 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdckq\" (UniqueName: \"kubernetes.io/projected/010792a0-26fd-456a-9186-79799c9a511e-kube-api-access-fdckq\") pod \"watcher-operator-controller-manager-64cd966744-czzg6\" (UID: \"010792a0-26fd-456a-9186-79799c9a511e\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" Jan 21 09:15:46 
crc kubenswrapper[4618]: I0121 09:15:46.243932 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.244035 4618 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.244074 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert podName:cad4873a-5a2e-40ea-a4b1-3173e8138be0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:47.244062713 +0000 UTC m=+745.994530020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert") pod "infra-operator-controller-manager-77c48c7859-dsjzx" (UID: "cad4873a-5a2e-40ea-a4b1-3173e8138be0") : secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.268565 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.269541 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.270344 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.272109 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xqq54" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.275425 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.289159 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdckq\" (UniqueName: \"kubernetes.io/projected/010792a0-26fd-456a-9186-79799c9a511e-kube-api-access-fdckq\") pod \"watcher-operator-controller-manager-64cd966744-czzg6\" (UID: \"010792a0-26fd-456a-9186-79799c9a511e\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.344935 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.345005 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wbx\" (UniqueName: \"kubernetes.io/projected/cfa3b66e-c251-46f7-ade1-edd4df56db67-kube-api-access-r7wbx\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.345030 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.345052 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.345085 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvx6h\" (UniqueName: \"kubernetes.io/projected/1bab5bac-6dfb-48f0-bf21-71dbfb2d3653-kube-api-access-fvx6h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-9nmj5\" (UID: \"1bab5bac-6dfb-48f0-bf21-71dbfb2d3653\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.345269 4618 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.345308 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:46.845294883 +0000 UTC m=+745.595762200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "metrics-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.345370 4618 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.345425 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:46.845408788 +0000 UTC m=+745.595876104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.345269 4618 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.345744 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert podName:b662a5ae-39f6-4592-baf2-efa15f7c82b0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:47.345725613 +0000 UTC m=+746.096192930 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" (UID: "b662a5ae-39f6-4592-baf2-efa15f7c82b0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.366459 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wbx\" (UniqueName: \"kubernetes.io/projected/cfa3b66e-c251-46f7-ade1-edd4df56db67-kube-api-access-r7wbx\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.401010 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.445806 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvx6h\" (UniqueName: \"kubernetes.io/projected/1bab5bac-6dfb-48f0-bf21-71dbfb2d3653-kube-api-access-fvx6h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-9nmj5\" (UID: \"1bab5bac-6dfb-48f0-bf21-71dbfb2d3653\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.471331 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvx6h\" (UniqueName: \"kubernetes.io/projected/1bab5bac-6dfb-48f0-bf21-71dbfb2d3653-kube-api-access-fvx6h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-9nmj5\" (UID: \"1bab5bac-6dfb-48f0-bf21-71dbfb2d3653\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.502327 4618 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.506086 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-nf54z"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.528470 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm"] Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.550384 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0011800_e28a_4e71_8306_819d8d865dfe.slice/crio-71d0a4082f256417d2edb0777a5fd3df702fd6d23b9fd73575f944ebb404b708 WatchSource:0}: Error finding container 71d0a4082f256417d2edb0777a5fd3df702fd6d23b9fd73575f944ebb404b708: Status 404 returned error can't find the container with id 71d0a4082f256417d2edb0777a5fd3df702fd6d23b9fd73575f944ebb404b708 Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.632331 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l"] Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.634554 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff11d9c_92c7_4b78_8336_70e117f63880.slice/crio-44b99d0f2e05856d41582dee2acfda1004c6b48203d966efbd6b7960bebe7958 WatchSource:0}: Error finding container 44b99d0f2e05856d41582dee2acfda1004c6b48203d966efbd6b7960bebe7958: Status 404 returned error can't find the container with id 44b99d0f2e05856d41582dee2acfda1004c6b48203d966efbd6b7960bebe7958 Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.645335 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.656094 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.729013 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.735442 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.742718 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.828853 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" event={"ID":"d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4","Type":"ContainerStarted","Data":"02e2e05adfcce581b1c9d7272acae654c888bb658ee9897cc7f2b7fb98af61e2"} Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.830539 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14908c8c_b444_4359_9e3a_e0fcc443e9f7.slice/crio-e30af55a7027056dfa8c164e3bdb6a680059dec32d56733b7db13b0f5c4c239b WatchSource:0}: Error finding container e30af55a7027056dfa8c164e3bdb6a680059dec32d56733b7db13b0f5c4c239b: Status 404 returned error can't find the container with id e30af55a7027056dfa8c164e3bdb6a680059dec32d56733b7db13b0f5c4c239b Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.830818 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz"] Jan 21 09:15:46 crc 
kubenswrapper[4618]: W0121 09:15:46.831626 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61c3771f_ea2c_4307_8d5b_7f44194235cd.slice/crio-353a040b21445b73925b01a6f4752d63e677eb545650e95a0b354ed376a51564 WatchSource:0}: Error finding container 353a040b21445b73925b01a6f4752d63e677eb545650e95a0b354ed376a51564: Status 404 returned error can't find the container with id 353a040b21445b73925b01a6f4752d63e677eb545650e95a0b354ed376a51564 Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.834939 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" event={"ID":"f3975776-d0c3-478c-873c-349415bf2d3c","Type":"ContainerStarted","Data":"8f1dbe927f1143f58e5c6357ccba2b1e5c0889abff82820406108d7730f6abdd"} Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.838332 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.840051 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4"] Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.843780 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btwql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-cmhx4_openstack-operators(1739988f-1de9-4c68-85ac-c14971105314): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.843973 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5"] Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.844597 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" event={"ID":"80cee31f-467d-4c99-8b58-1edbee74f4a9","Type":"ContainerStarted","Data":"ac1e619220e7809d64e3fa20229f94ea7ad671377251daea3af237d04ded5dc5"} Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.844858 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" podUID="1739988f-1de9-4c68-85ac-c14971105314" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.845505 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" event={"ID":"e0011800-e28a-4e71-8306-819d8d865dfe","Type":"ContainerStarted","Data":"71d0a4082f256417d2edb0777a5fd3df702fd6d23b9fd73575f944ebb404b708"} Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.849592 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" event={"ID":"0ec13d1d-fae7-4efd-92d6-0b93f972694f","Type":"ContainerStarted","Data":"decc688234ff2d27ec180ea16e9ca272ee5265ee2f9a7aa0a6a22e0181bb94f6"} Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.850632 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.850663 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.850812 4618 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.850845 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. 
No retries permitted until 2026-01-21 09:15:47.850833222 +0000 UTC m=+746.601300539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "webhook-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.850880 4618 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.850898 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:47.850892033 +0000 UTC m=+746.601359350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "metrics-server-cert" not found Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.851814 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" event={"ID":"0ff11d9c-92c7-4b78-8336-70e117f63880","Type":"ContainerStarted","Data":"44b99d0f2e05856d41582dee2acfda1004c6b48203d966efbd6b7960bebe7958"} Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.852424 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" event={"ID":"982d4204-447a-43c3-858e-c16cceebf1bb","Type":"ContainerStarted","Data":"0ad6cf94953d5f52585ebcabcfc13547f0604a160850bde9e96004a266d33358"} Jan 21 09:15:46 crc kubenswrapper[4618]: 
I0121 09:15:46.853008 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" event={"ID":"f0bde946-f6c9-45a5-a124-6cf62551f0bc","Type":"ContainerStarted","Data":"2e69dfe7b6820e31c11597fdac9bb6abe495d8ed095a2b85d7e72a2d95783528"} Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.854366 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" event={"ID":"276f144f-a185-46da-a3af-f0aa8a9eaaad","Type":"ContainerStarted","Data":"7bd86d1a2468fa11b28d40eff58cbaad5b941d5142ed3252d739fa5203e45ee1"} Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.891871 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5"] Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.895555 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bab5bac_6dfb_48f0_bf21_71dbfb2d3653.slice/crio-7a3a826416e31775c56452933088d43122d36b70c8a9f211a6d4e42cf7fde22b WatchSource:0}: Error finding container 7a3a826416e31775c56452933088d43122d36b70c8a9f211a6d4e42cf7fde22b: Status 404 returned error can't find the container with id 7a3a826416e31775c56452933088d43122d36b70c8a9f211a6d4e42cf7fde22b Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.897296 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fvx6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-9nmj5_openstack-operators(1bab5bac-6dfb-48f0-bf21-71dbfb2d3653): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.898726 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" podUID="1bab5bac-6dfb-48f0-bf21-71dbfb2d3653" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.915705 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv"] Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.920534 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16d3b481_106a_48ee_b99c_7a380086a9cd.slice/crio-33a78b40f714ffd1e728a549a4e9e528cfa3f8c8bbc80f0cc629a40565bce564 WatchSource:0}: Error finding container 33a78b40f714ffd1e728a549a4e9e528cfa3f8c8bbc80f0cc629a40565bce564: Status 404 returned error can't find the container with id 33a78b40f714ffd1e728a549a4e9e528cfa3f8c8bbc80f0cc629a40565bce564 Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.922453 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc"] Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.930371 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtfhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-778qv_openstack-operators(16d3b481-106a-48ee-b99c-7a380086a9cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.930596 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-r895x"] Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.931837 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" podUID="16d3b481-106a-48ee-b99c-7a380086a9cd" Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.932816 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7120e5_8e39_4664_9d63_beaea1ff4043.slice/crio-e878ff0442dbf8bc93ea2028ee57ffca832a5ca2b1d4107d14983f87c1826705 WatchSource:0}: Error finding container e878ff0442dbf8bc93ea2028ee57ffca832a5ca2b1d4107d14983f87c1826705: Status 404 returned error can't find the container with id 
e878ff0442dbf8bc93ea2028ee57ffca832a5ca2b1d4107d14983f87c1826705 Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.933360 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl"] Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.935330 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jtrz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-r895x_openstack-operators(1f7120e5-8e39-4664-9d63-beaea1ff4043): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.936462 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" podUID="1f7120e5-8e39-4664-9d63-beaea1ff4043" Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.936789 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3629416_c45e_46da_98ba_dfd8b6630abd.slice/crio-4c9c72ede6b39e91447226ae4d97a73cd300f94a22ef914ce5e3891761846f84 WatchSource:0}: Error finding container 4c9c72ede6b39e91447226ae4d97a73cd300f94a22ef914ce5e3891761846f84: Status 404 returned error can't find the container with id 4c9c72ede6b39e91447226ae4d97a73cd300f94a22ef914ce5e3891761846f84 Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.937232 4618 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af2019b_e469_403f_8c3e_91006f2902ad.slice/crio-3f19c9db940d26e568665f8eb2b7ccafb5bed72be802670684c437403dc5360a WatchSource:0}: Error finding container 3f19c9db940d26e568665f8eb2b7ccafb5bed72be802670684c437403dc5360a: Status 404 returned error can't find the container with id 3f19c9db940d26e568665f8eb2b7ccafb5bed72be802670684c437403dc5360a Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.939151 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jx9zm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-zgrxl_openstack-operators(5af2019b-e469-403f-8c3e-91006f2902ad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.941107 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" podUID="5af2019b-e469-403f-8c3e-91006f2902ad" Jan 21 09:15:46 crc kubenswrapper[4618]: I0121 09:15:46.941234 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd"] Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.941327 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kqw74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-7nkmc_openstack-operators(b3629416-c45e-46da-98ba-dfd8b6630abd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.942560 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" podUID="b3629416-c45e-46da-98ba-dfd8b6630abd" Jan 21 09:15:46 crc kubenswrapper[4618]: W0121 09:15:46.943116 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f5bddf_5e04_4510_903b_6861f19fa87b.slice/crio-c2112ae656fa55e5e6e350ddd1d9f39ac7eec313819b906830e5e7e3e60f34f8 WatchSource:0}: Error finding container c2112ae656fa55e5e6e350ddd1d9f39ac7eec313819b906830e5e7e3e60f34f8: Status 404 returned error can't find the container with id c2112ae656fa55e5e6e350ddd1d9f39ac7eec313819b906830e5e7e3e60f34f8 Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.945738 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sf986,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-g4khd_openstack-operators(e4f5bddf-5e04-4510-903b-6861f19fa87b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 09:15:46 crc kubenswrapper[4618]: E0121 09:15:46.946852 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" podUID="e4f5bddf-5e04-4510-903b-6861f19fa87b" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.024969 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6"] Jan 21 09:15:47 crc kubenswrapper[4618]: W0121 09:15:47.026488 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod010792a0_26fd_456a_9186_79799c9a511e.slice/crio-08d6b28d865ed38190d4ae5534fa5c0e89bd80508daf943d6b298d5dd848e9c1 WatchSource:0}: Error finding container 08d6b28d865ed38190d4ae5534fa5c0e89bd80508daf943d6b298d5dd848e9c1: Status 404 returned error can't find the container with id 
08d6b28d865ed38190d4ae5534fa5c0e89bd80508daf943d6b298d5dd848e9c1 Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.255386 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.255513 4618 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.255561 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert podName:cad4873a-5a2e-40ea-a4b1-3173e8138be0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:49.255550144 +0000 UTC m=+748.006017461 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert") pod "infra-operator-controller-manager-77c48c7859-dsjzx" (UID: "cad4873a-5a2e-40ea-a4b1-3173e8138be0") : secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.356932 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.357108 4618 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.357189 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert podName:b662a5ae-39f6-4592-baf2-efa15f7c82b0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:49.35717311 +0000 UTC m=+748.107640437 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" (UID: "b662a5ae-39f6-4592-baf2-efa15f7c82b0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.862178 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" event={"ID":"61c3771f-ea2c-4307-8d5b-7f44194235cd","Type":"ContainerStarted","Data":"353a040b21445b73925b01a6f4752d63e677eb545650e95a0b354ed376a51564"} Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.862382 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.862467 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.862488 4618 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.862561 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" 
failed. No retries permitted until 2026-01-21 09:15:49.862548091 +0000 UTC m=+748.613015409 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "metrics-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.862644 4618 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.862691 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:49.862677946 +0000 UTC m=+748.613145263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "webhook-server-cert" not found Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.863901 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" event={"ID":"16d3b481-106a-48ee-b99c-7a380086a9cd","Type":"ContainerStarted","Data":"33a78b40f714ffd1e728a549a4e9e528cfa3f8c8bbc80f0cc629a40565bce564"} Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.864775 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" event={"ID":"010792a0-26fd-456a-9186-79799c9a511e","Type":"ContainerStarted","Data":"08d6b28d865ed38190d4ae5534fa5c0e89bd80508daf943d6b298d5dd848e9c1"} Jan 21 09:15:47 crc 
kubenswrapper[4618]: I0121 09:15:47.866222 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" event={"ID":"e4f5bddf-5e04-4510-903b-6861f19fa87b","Type":"ContainerStarted","Data":"c2112ae656fa55e5e6e350ddd1d9f39ac7eec313819b906830e5e7e3e60f34f8"} Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.867589 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" podUID="e4f5bddf-5e04-4510-903b-6861f19fa87b" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.867960 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" event={"ID":"14908c8c-b444-4359-9e3a-e0fcc443e9f7","Type":"ContainerStarted","Data":"e30af55a7027056dfa8c164e3bdb6a680059dec32d56733b7db13b0f5c4c239b"} Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.868074 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" podUID="16d3b481-106a-48ee-b99c-7a380086a9cd" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.871920 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" event={"ID":"1f7120e5-8e39-4664-9d63-beaea1ff4043","Type":"ContainerStarted","Data":"e878ff0442dbf8bc93ea2028ee57ffca832a5ca2b1d4107d14983f87c1826705"} Jan 21 09:15:47 crc 
kubenswrapper[4618]: I0121 09:15:47.873425 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" event={"ID":"5af2019b-e469-403f-8c3e-91006f2902ad","Type":"ContainerStarted","Data":"3f19c9db940d26e568665f8eb2b7ccafb5bed72be802670684c437403dc5360a"} Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.874121 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" podUID="5af2019b-e469-403f-8c3e-91006f2902ad" Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.875488 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" podUID="1f7120e5-8e39-4664-9d63-beaea1ff4043" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.881085 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" event={"ID":"1739988f-1de9-4c68-85ac-c14971105314","Type":"ContainerStarted","Data":"59bb8e179d4d956e1a5418abfeadd9aecee27cf0402402a7cafdac227eae2af5"} Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.882011 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" podUID="1739988f-1de9-4c68-85ac-c14971105314" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.882775 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" event={"ID":"1bab5bac-6dfb-48f0-bf21-71dbfb2d3653","Type":"ContainerStarted","Data":"7a3a826416e31775c56452933088d43122d36b70c8a9f211a6d4e42cf7fde22b"} Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.883727 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" podUID="1bab5bac-6dfb-48f0-bf21-71dbfb2d3653" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.886980 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" event={"ID":"b3629416-c45e-46da-98ba-dfd8b6630abd","Type":"ContainerStarted","Data":"4c9c72ede6b39e91447226ae4d97a73cd300f94a22ef914ce5e3891761846f84"} Jan 21 09:15:47 crc kubenswrapper[4618]: E0121 09:15:47.889834 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" podUID="b3629416-c45e-46da-98ba-dfd8b6630abd" Jan 21 09:15:47 crc kubenswrapper[4618]: I0121 09:15:47.893460 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" 
event={"ID":"69396ad4-b4ad-4f43-a0f5-83b655e590da","Type":"ContainerStarted","Data":"54b51286f457f02c3776e85906a08e04da5deb53f958dace11ccf8372f03a20e"} Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903457 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" podUID="5af2019b-e469-403f-8c3e-91006f2902ad" Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903464 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" podUID="1f7120e5-8e39-4664-9d63-beaea1ff4043" Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903502 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" podUID="e4f5bddf-5e04-4510-903b-6861f19fa87b" Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903551 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" 
podUID="16d3b481-106a-48ee-b99c-7a380086a9cd" Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903551 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" podUID="1bab5bac-6dfb-48f0-bf21-71dbfb2d3653" Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903556 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" podUID="b3629416-c45e-46da-98ba-dfd8b6630abd" Jan 21 09:15:48 crc kubenswrapper[4618]: E0121 09:15:48.903712 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" podUID="1739988f-1de9-4c68-85ac-c14971105314" Jan 21 09:15:49 crc kubenswrapper[4618]: I0121 09:15:49.283335 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.283562 4618 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.283630 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert podName:cad4873a-5a2e-40ea-a4b1-3173e8138be0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:53.283614952 +0000 UTC m=+752.034082269 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert") pod "infra-operator-controller-manager-77c48c7859-dsjzx" (UID: "cad4873a-5a2e-40ea-a4b1-3173e8138be0") : secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: I0121 09:15:49.384669 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.384802 4618 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.384856 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert podName:b662a5ae-39f6-4592-baf2-efa15f7c82b0 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:53.384843615 +0000 UTC m=+752.135310933 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" (UID: "b662a5ae-39f6-4592-baf2-efa15f7c82b0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: I0121 09:15:49.524658 4618 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 09:15:49 crc kubenswrapper[4618]: I0121 09:15:49.901072 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:49 crc kubenswrapper[4618]: I0121 09:15:49.901124 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.901244 4618 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.901291 4618 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.901313 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs 
podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:53.901288904 +0000 UTC m=+752.651756220 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "metrics-server-cert" not found Jan 21 09:15:49 crc kubenswrapper[4618]: E0121 09:15:49.901338 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:15:53.901325993 +0000 UTC m=+752.651793310 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "webhook-server-cert" not found Jan 21 09:15:53 crc kubenswrapper[4618]: I0121 09:15:53.352842 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.353099 4618 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.353203 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert 
podName:cad4873a-5a2e-40ea-a4b1-3173e8138be0 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:01.353182563 +0000 UTC m=+760.103649890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert") pod "infra-operator-controller-manager-77c48c7859-dsjzx" (UID: "cad4873a-5a2e-40ea-a4b1-3173e8138be0") : secret "infra-operator-webhook-server-cert" not found Jan 21 09:15:53 crc kubenswrapper[4618]: I0121 09:15:53.454744 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.454979 4618 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.455041 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert podName:b662a5ae-39f6-4592-baf2-efa15f7c82b0 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:01.455027868 +0000 UTC m=+760.205495184 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" (UID: "b662a5ae-39f6-4592-baf2-efa15f7c82b0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 09:15:53 crc kubenswrapper[4618]: I0121 09:15:53.961186 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:53 crc kubenswrapper[4618]: I0121 09:15:53.961231 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.961388 4618 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.961427 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:01.961415355 +0000 UTC m=+760.711882672 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "webhook-server-cert" not found
Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.961701 4618 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 21 09:15:53 crc kubenswrapper[4618]: E0121 09:15:53.961731 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:01.961723275 +0000 UTC m=+760.712190593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "metrics-server-cert" not found
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.937133 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" event={"ID":"d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4","Type":"ContainerStarted","Data":"e911989cf24ea7c223fe7862e3af7c2e9b3590f3eb6f2171b83fab7b4ccffdfb"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.938277 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" event={"ID":"f3975776-d0c3-478c-873c-349415bf2d3c","Type":"ContainerStarted","Data":"d8065cd50ad4f5a182226596f440dd9f9dc1495e6c971f207c44faecc7a05f8b"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.938391 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.939262 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" event={"ID":"80cee31f-467d-4c99-8b58-1edbee74f4a9","Type":"ContainerStarted","Data":"5517b3e3b182efc2ccd066721bb1a594609b44ad3871fadbf711d6c7c8ce3346"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.939392 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.940199 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" event={"ID":"0ec13d1d-fae7-4efd-92d6-0b93f972694f","Type":"ContainerStarted","Data":"1327e3ab87a5e95d34b8bf044803b2b31cb9446648940e5dadb25cf6646d27bd"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.940298 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.941041 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" event={"ID":"61c3771f-ea2c-4307-8d5b-7f44194235cd","Type":"ContainerStarted","Data":"7e9c23df0fe539645065eaeadc430ec7127d7c1e840ba9af911cb383015f04df"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.941179 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.941907 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" event={"ID":"69396ad4-b4ad-4f43-a0f5-83b655e590da","Type":"ContainerStarted","Data":"2f8cdad8969bb87f130cf06ea87e0bdb6b4b20e51698b982cd4645e8f348b5be"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.941962 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.942787 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" event={"ID":"010792a0-26fd-456a-9186-79799c9a511e","Type":"ContainerStarted","Data":"1b7799154c29f1080a1af38ba5fce4c36bf2f0a970e21a6225ba8bc727503439"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.942844 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.943899 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" event={"ID":"e0011800-e28a-4e71-8306-819d8d865dfe","Type":"ContainerStarted","Data":"b0fdb98707dfa1becc5543b192bbfd5da11e43635ccc39218648ea31f0240947"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.944003 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.944787 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" event={"ID":"14908c8c-b444-4359-9e3a-e0fcc443e9f7","Type":"ContainerStarted","Data":"845676c88465c759aef710a0e6146d9dd87110cd136af9e7aa9896144e4317f3"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.944827 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.945740 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" event={"ID":"276f144f-a185-46da-a3af-f0aa8a9eaaad","Type":"ContainerStarted","Data":"f2db0edc99b202f66bf3bcf37ed284b3ede1191632808f92b392cebfba3b8fda"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.945839 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.946612 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" event={"ID":"0ff11d9c-92c7-4b78-8336-70e117f63880","Type":"ContainerStarted","Data":"5d3501fc681a937ee145a694ef5c10fe4174f1b80e69997f708615d18b38eeb3"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.946656 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.947487 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" event={"ID":"982d4204-447a-43c3-858e-c16cceebf1bb","Type":"ContainerStarted","Data":"36d8b8a35d6fb0672fcbe6040f21f541cbcf1d75c9afcf2967bcd53b722e91ec"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.947636 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.948319 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" event={"ID":"f0bde946-f6c9-45a5-a124-6cf62551f0bc","Type":"ContainerStarted","Data":"7545e2577f47f373f7bd611ee2bd54ed1c88923282ca22badfe0cf00ce99b21e"}
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.948432 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn"
Jan 21 09:15:55 crc kubenswrapper[4618]: I0121 09:15:55.954566 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64" podStartSLOduration=1.929809493 podStartE2EDuration="10.95455702s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.520004409 +0000 UTC m=+745.270471725" lastFinishedPulling="2026-01-21 09:15:55.544751935 +0000 UTC m=+754.295219252" observedRunningTime="2026-01-21 09:15:55.951113294 +0000 UTC m=+754.701580612" watchObservedRunningTime="2026-01-21 09:15:55.95455702 +0000 UTC m=+754.705024337"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.006411 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn" podStartSLOduration=2.19018745 podStartE2EDuration="11.006396849s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.746510171 +0000 UTC m=+745.496977488" lastFinishedPulling="2026-01-21 09:15:55.56271957 +0000 UTC m=+754.313186887" observedRunningTime="2026-01-21 09:15:55.975964181 +0000 UTC m=+754.726431498" watchObservedRunningTime="2026-01-21 09:15:56.006396849 +0000 UTC m=+754.756864166"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.006744 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp" podStartSLOduration=2.21342062 podStartE2EDuration="11.006740857s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.751384601 +0000 UTC m=+745.501851907" lastFinishedPulling="2026-01-21 09:15:55.544704827 +0000 UTC m=+754.295172144" observedRunningTime="2026-01-21 09:15:55.994271064 +0000 UTC m=+754.744738382" watchObservedRunningTime="2026-01-21 09:15:56.006740857 +0000 UTC m=+754.757208174"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.048412 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl" podStartSLOduration=2.243918481 podStartE2EDuration="11.048395647s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.746219043 +0000 UTC m=+745.496686360" lastFinishedPulling="2026-01-21 09:15:55.550696209 +0000 UTC m=+754.301163526" observedRunningTime="2026-01-21 09:15:56.046665038 +0000 UTC m=+754.797132355" watchObservedRunningTime="2026-01-21 09:15:56.048395647 +0000 UTC m=+754.798862964"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.048484 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5" podStartSLOduration=2.287188902 podStartE2EDuration="11.048480185s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.835755039 +0000 UTC m=+745.586222355" lastFinishedPulling="2026-01-21 09:15:55.597046331 +0000 UTC m=+754.347513638" observedRunningTime="2026-01-21 09:15:56.01491552 +0000 UTC m=+754.765382837" watchObservedRunningTime="2026-01-21 09:15:56.048480185 +0000 UTC m=+754.798947503"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.119918 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6" podStartSLOduration=1.594555273 podStartE2EDuration="10.119898774s" podCreationTimestamp="2026-01-21 09:15:46 +0000 UTC" firstStartedPulling="2026-01-21 09:15:47.028253452 +0000 UTC m=+745.778720768" lastFinishedPulling="2026-01-21 09:15:55.553596951 +0000 UTC m=+754.304064269" observedRunningTime="2026-01-21 09:15:56.105079548 +0000 UTC m=+754.855546866" watchObservedRunningTime="2026-01-21 09:15:56.119898774 +0000 UTC m=+754.870366091"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.121027 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm" podStartSLOduration=2.111947256 podStartE2EDuration="11.121017993s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.552482197 +0000 UTC m=+745.302949514" lastFinishedPulling="2026-01-21 09:15:55.561552933 +0000 UTC m=+754.312020251" observedRunningTime="2026-01-21 09:15:56.079603474 +0000 UTC m=+754.830070791" watchObservedRunningTime="2026-01-21 09:15:56.121017993 +0000 UTC m=+754.871485389"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.203918 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z" podStartSLOduration=2.216455216 podStartE2EDuration="11.203902041s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.517347034 +0000 UTC m=+745.267814352" lastFinishedPulling="2026-01-21 09:15:55.504793861 +0000 UTC m=+754.255261177" observedRunningTime="2026-01-21 09:15:56.166469191 +0000 UTC m=+754.916936509" watchObservedRunningTime="2026-01-21 09:15:56.203902041 +0000 UTC m=+754.954369358"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.246437 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l" podStartSLOduration=2.346252896 podStartE2EDuration="11.246421018s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.638230119 +0000 UTC m=+745.388697436" lastFinishedPulling="2026-01-21 09:15:55.538398241 +0000 UTC m=+754.288865558" observedRunningTime="2026-01-21 09:15:56.207111795 +0000 UTC m=+754.957579111" watchObservedRunningTime="2026-01-21 09:15:56.246421018 +0000 UTC m=+754.996888335"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.285377 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2" podStartSLOduration=2.51432854 podStartE2EDuration="11.285363731s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.348308608 +0000 UTC m=+745.098775925" lastFinishedPulling="2026-01-21 09:15:55.119343799 +0000 UTC m=+753.869811116" observedRunningTime="2026-01-21 09:15:56.247466397 +0000 UTC m=+754.997933714" watchObservedRunningTime="2026-01-21 09:15:56.285363731 +0000 UTC m=+755.035831048"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.285449 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75" podStartSLOduration=2.615568275 podStartE2EDuration="11.285445626s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.840695311 +0000 UTC m=+745.591162628" lastFinishedPulling="2026-01-21 09:15:55.510572672 +0000 UTC m=+754.261039979" observedRunningTime="2026-01-21 09:15:56.283264569 +0000 UTC m=+755.033731886" watchObservedRunningTime="2026-01-21 09:15:56.285445626 +0000 UTC m=+755.035912942"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.363276 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz" podStartSLOduration=2.594166542 podStartE2EDuration="11.36326143s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.835471825 +0000 UTC m=+745.585939142" lastFinishedPulling="2026-01-21 09:15:55.604566712 +0000 UTC m=+754.355034030" observedRunningTime="2026-01-21 09:15:56.324535596 +0000 UTC m=+755.075002912" watchObservedRunningTime="2026-01-21 09:15:56.36326143 +0000 UTC m=+755.113728748"
Jan 21 09:15:56 crc kubenswrapper[4618]: I0121 09:15:56.953679 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64"
Jan 21 09:16:01 crc kubenswrapper[4618]: I0121 09:16:01.444286 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"
Jan 21 09:16:01 crc kubenswrapper[4618]: E0121 09:16:01.444421 4618 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 21 09:16:01 crc kubenswrapper[4618]: E0121 09:16:01.444810 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert podName:cad4873a-5a2e-40ea-a4b1-3173e8138be0 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:17.444795183 +0000 UTC m=+776.195262501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert") pod "infra-operator-controller-manager-77c48c7859-dsjzx" (UID: "cad4873a-5a2e-40ea-a4b1-3173e8138be0") : secret "infra-operator-webhook-server-cert" not found
Jan 21 09:16:01 crc kubenswrapper[4618]: I0121 09:16:01.546134 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"
Jan 21 09:16:01 crc kubenswrapper[4618]: E0121 09:16:01.546217 4618 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 21 09:16:01 crc kubenswrapper[4618]: E0121 09:16:01.546260 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert podName:b662a5ae-39f6-4592-baf2-efa15f7c82b0 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:17.546249312 +0000 UTC m=+776.296716629 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" (UID: "b662a5ae-39f6-4592-baf2-efa15f7c82b0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 21 09:16:02 crc kubenswrapper[4618]: I0121 09:16:02.052150 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"
Jan 21 09:16:02 crc kubenswrapper[4618]: I0121 09:16:02.052193 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"
Jan 21 09:16:02 crc kubenswrapper[4618]: E0121 09:16:02.052315 4618 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 21 09:16:02 crc kubenswrapper[4618]: E0121 09:16:02.052333 4618 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 21 09:16:02 crc kubenswrapper[4618]: E0121 09:16:02.052373 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:18.052362644 +0000 UTC m=+776.802829961 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "webhook-server-cert" not found
Jan 21 09:16:02 crc kubenswrapper[4618]: E0121 09:16:02.052394 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs podName:cfa3b66e-c251-46f7-ade1-edd4df56db67 nodeName:}" failed. No retries permitted until 2026-01-21 09:16:18.052388853 +0000 UTC m=+776.802856171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-42lr9" (UID: "cfa3b66e-c251-46f7-ade1-edd4df56db67") : secret "metrics-server-cert" not found
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.774006 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6j9f2"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.786553 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6zn64"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.792586 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc" podStartSLOduration=11.942965051 podStartE2EDuration="20.792578202s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.68545574 +0000 UTC m=+745.435923057" lastFinishedPulling="2026-01-21 09:15:55.535068891 +0000 UTC m=+754.285536208" observedRunningTime="2026-01-21 09:15:56.365367847 +0000 UTC m=+755.115835163" watchObservedRunningTime="2026-01-21 09:16:05.792578202 +0000 UTC m=+764.543045519"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.800433 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-nf54z"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.809960 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4r2qm"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.861413 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-ms7zc"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.879939 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-bd65l"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.900588 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-g58rl"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.915382 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-l55q5"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.926672 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-djc75"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.938971 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-lsgpp"
Jan 21 09:16:05 crc kubenswrapper[4618]: I0121 09:16:05.944598 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-5m9wn"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.005389 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" event={"ID":"1bab5bac-6dfb-48f0-bf21-71dbfb2d3653","Type":"ContainerStarted","Data":"1f87942ef59816bac19280e93200120d885ed9bc65b74e3c85672d3b89bee27e"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.007032 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" event={"ID":"b3629416-c45e-46da-98ba-dfd8b6630abd","Type":"ContainerStarted","Data":"13916ce87f7d86225e3d0fe56f84f7a2bab3dbf5788fa3ba0ca7ddbbdb344518"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.007186 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.008063 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" event={"ID":"16d3b481-106a-48ee-b99c-7a380086a9cd","Type":"ContainerStarted","Data":"de5fdfd47480b064a0b408810f5d75b1d4cbb860d228d0c92ccf1db6aace0de8"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.008305 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.009098 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" event={"ID":"e4f5bddf-5e04-4510-903b-6861f19fa87b","Type":"ContainerStarted","Data":"23d2981577cbbadb5393a28658be980216d396811239ee2b4a1bf5c2841cf513"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.009435 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.010298 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" event={"ID":"1f7120e5-8e39-4664-9d63-beaea1ff4043","Type":"ContainerStarted","Data":"d8dc54f0cebc3294d85de855d5abcfb71022d9bae950f748a937a7ddfb76ffd5"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.010606 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.016365 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" event={"ID":"5af2019b-e469-403f-8c3e-91006f2902ad","Type":"ContainerStarted","Data":"c44d3362fb58f35d8b331e2d41aedf44ecd384ab3c448f8e5f49ec2763c9721b"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.016724 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.016906 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-j5xjz"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.029535 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-9nmj5" podStartSLOduration=1.9929818799999999 podStartE2EDuration="20.029525807s" podCreationTimestamp="2026-01-21 09:15:46 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.89720827 +0000 UTC m=+745.647675587" lastFinishedPulling="2026-01-21 09:16:04.933752197 +0000 UTC m=+763.684219514" observedRunningTime="2026-01-21 09:16:06.027437576 +0000 UTC m=+764.777904893" watchObservedRunningTime="2026-01-21 09:16:06.029525807 +0000 UTC m=+764.779993124"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.029723 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" event={"ID":"1739988f-1de9-4c68-85ac-c14971105314","Type":"ContainerStarted","Data":"f5f9b288140785cdc92e525d7a0cbb4cba1d7f407a33383f86e3b9f7fb3c6d53"}
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.029980 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.045705 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl" podStartSLOduration=2.625465282 podStartE2EDuration="21.045690909s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.938976805 +0000 UTC m=+745.689444122" lastFinishedPulling="2026-01-21 09:16:05.359136759 +0000 UTC m=+764.109669749" observedRunningTime="2026-01-21 09:16:06.042104565 +0000 UTC m=+764.792571882" watchObservedRunningTime="2026-01-21 09:16:06.045690909 +0000 UTC m=+764.796158226"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.065469 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc" podStartSLOduration=2.627326105 podStartE2EDuration="21.065456669s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.94125281 +0000 UTC m=+745.691720128" lastFinishedPulling="2026-01-21 09:16:05.379383375 +0000 UTC m=+764.129850692" observedRunningTime="2026-01-21 09:16:06.063260934 +0000 UTC m=+764.813728252" watchObservedRunningTime="2026-01-21 09:16:06.065456669 +0000 UTC m=+764.815923986"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.073773 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd" podStartSLOduration=2.642786809 podStartE2EDuration="21.073759355s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.94562253 +0000 UTC m=+745.696089846" lastFinishedPulling="2026-01-21 09:16:05.376595074 +0000 UTC m=+764.127062392" observedRunningTime="2026-01-21 09:16:06.072663741 +0000 UTC m=+764.823131058" watchObservedRunningTime="2026-01-21 09:16:06.073759355 +0000 UTC m=+764.824226671"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.091665 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv" podStartSLOduration=3.788666815 podStartE2EDuration="21.091652598s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.930267234 +0000 UTC m=+745.680734551" lastFinishedPulling="2026-01-21 09:16:04.233253017 +0000 UTC m=+762.983720334" observedRunningTime="2026-01-21 09:16:06.087652135 +0000 UTC m=+764.838119453" watchObservedRunningTime="2026-01-21 09:16:06.091652598 +0000 UTC m=+764.842119916"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.101916 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x" podStartSLOduration=2.657734267 podStartE2EDuration="21.101903692s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.935243404 +0000 UTC m=+745.685710722" lastFinishedPulling="2026-01-21 09:16:05.379412831 +0000 UTC m=+764.129880147" observedRunningTime="2026-01-21 09:16:06.100350929 +0000 UTC m=+764.850818246" watchObservedRunningTime="2026-01-21 09:16:06.101903692 +0000 UTC m=+764.852371009"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.130857 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4" podStartSLOduration=2.616551866 podStartE2EDuration="21.130842347s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:15:46.843675072 +0000 UTC m=+745.594142389" lastFinishedPulling="2026-01-21 09:16:05.357965552 +0000 UTC m=+764.108432870" observedRunningTime="2026-01-21 09:16:06.12734981 +0000 UTC m=+764.877817127" watchObservedRunningTime="2026-01-21 09:16:06.130842347 +0000 UTC m=+764.881309664"
Jan 21 09:16:06 crc kubenswrapper[4618]: I0121 09:16:06.404696 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-czzg6"
Jan 21 09:16:15 crc kubenswrapper[4618]: I0121 09:16:15.981427 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-cmhx4"
Jan 21 09:16:16 crc kubenswrapper[4618]: I0121 09:16:16.003627 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-7nkmc"
Jan 21 09:16:16 crc kubenswrapper[4618]: I0121 09:16:16.049759 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-r895x"
Jan 21 09:16:16 crc kubenswrapper[4618]: I0121 09:16:16.102860 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-zgrxl"
Jan 21 09:16:16 crc kubenswrapper[4618]: I0121 09:16:16.208669 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-778qv"
Jan 21 09:16:16 crc kubenswrapper[4618]: I0121 09:16:16.272878 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-g4khd"
Jan 21 09:16:17 crc kubenswrapper[4618]: I0121 09:16:17.521042 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"
Jan 21 09:16:17 crc kubenswrapper[4618]: I0121 09:16:17.525445 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad4873a-5a2e-40ea-a4b1-3173e8138be0-cert\") pod \"infra-operator-controller-manager-77c48c7859-dsjzx\" (UID: \"cad4873a-5a2e-40ea-a4b1-3173e8138be0\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"
Jan 21 09:16:17 crc kubenswrapper[4618]: I0121 09:16:17.622284 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"
Jan 21 09:16:17 crc kubenswrapper[4618]: I0121 09:16:17.624821 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b662a5ae-39f6-4592-baf2-efa15f7c82b0-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986dvc9c5\" (UID: \"b662a5ae-39f6-4592-baf2-efa15f7c82b0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"
Jan 21 09:16:17 crc kubenswrapper[4618]: I0121 09:16:17.689632 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"
Jan 21 09:16:17 crc kubenswrapper[4618]: I0121 09:16:17.811846 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.023586 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx"]
Jan 21 09:16:18 crc kubenswrapper[4618]: W0121 09:16:18.026194 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad4873a_5a2e_40ea_a4b1_3173e8138be0.slice/crio-afd1c38b5bcdd236f345d5baca33cb7806bd3ef1a62dbfcf32664a1c3c09ceaf WatchSource:0}: Error finding container afd1c38b5bcdd236f345d5baca33cb7806bd3ef1a62dbfcf32664a1c3c09ceaf: Status 404 returned error can't find the container with id afd1c38b5bcdd236f345d5baca33cb7806bd3ef1a62dbfcf32664a1c3c09ceaf
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.084896 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" event={"ID":"cad4873a-5a2e-40ea-a4b1-3173e8138be0","Type":"ContainerStarted","Data":"afd1c38b5bcdd236f345d5baca33cb7806bd3ef1a62dbfcf32664a1c3c09ceaf"}
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.128825 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.129181 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.132855 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.133955 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa3b66e-c251-46f7-ade1-edd4df56db67-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-42lr9\" (UID: \"cfa3b66e-c251-46f7-ade1-edd4df56db67\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.157587 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5"]
Jan 21 09:16:18 crc kubenswrapper[4618]: W0121 09:16:18.159752 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb662a5ae_39f6_4592_baf2_efa15f7c82b0.slice/crio-1fac600968232e847159b6e44ed2e11da3a9195a342bdc3d48cf85d2ff34e971 WatchSource:0}: Error finding container 1fac600968232e847159b6e44ed2e11da3a9195a342bdc3d48cf85d2ff34e971: Status 404 returned error can't find the container with id 1fac600968232e847159b6e44ed2e11da3a9195a342bdc3d48cf85d2ff34e971
Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.432893
4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:16:18 crc kubenswrapper[4618]: I0121 09:16:18.776718 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9"] Jan 21 09:16:18 crc kubenswrapper[4618]: W0121 09:16:18.778693 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa3b66e_c251_46f7_ade1_edd4df56db67.slice/crio-ac1b3f090aab863f43bde41c3a0630b24759980aa51d80ea62ed45e54bddff1b WatchSource:0}: Error finding container ac1b3f090aab863f43bde41c3a0630b24759980aa51d80ea62ed45e54bddff1b: Status 404 returned error can't find the container with id ac1b3f090aab863f43bde41c3a0630b24759980aa51d80ea62ed45e54bddff1b Jan 21 09:16:19 crc kubenswrapper[4618]: I0121 09:16:19.095610 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" event={"ID":"cfa3b66e-c251-46f7-ade1-edd4df56db67","Type":"ContainerStarted","Data":"2b3949e34364e546df1ccc08f7cebc90560eb3cc9ac74006be55bb8472673488"} Jan 21 09:16:19 crc kubenswrapper[4618]: I0121 09:16:19.095655 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" event={"ID":"cfa3b66e-c251-46f7-ade1-edd4df56db67","Type":"ContainerStarted","Data":"ac1b3f090aab863f43bde41c3a0630b24759980aa51d80ea62ed45e54bddff1b"} Jan 21 09:16:19 crc kubenswrapper[4618]: I0121 09:16:19.095813 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:16:19 crc kubenswrapper[4618]: I0121 09:16:19.097579 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" event={"ID":"b662a5ae-39f6-4592-baf2-efa15f7c82b0","Type":"ContainerStarted","Data":"1fac600968232e847159b6e44ed2e11da3a9195a342bdc3d48cf85d2ff34e971"} Jan 21 09:16:19 crc kubenswrapper[4618]: I0121 09:16:19.120223 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" podStartSLOduration=33.120207796 podStartE2EDuration="33.120207796s" podCreationTimestamp="2026-01-21 09:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:16:19.117338141 +0000 UTC m=+777.867805459" watchObservedRunningTime="2026-01-21 09:16:19.120207796 +0000 UTC m=+777.870675112" Jan 21 09:16:21 crc kubenswrapper[4618]: I0121 09:16:21.107480 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" event={"ID":"b662a5ae-39f6-4592-baf2-efa15f7c82b0","Type":"ContainerStarted","Data":"acc8a0ceab05bb3cfad486ae05480aa6829ec77df74e0a0c6d8c7b26e8cdd5b2"} Jan 21 09:16:21 crc kubenswrapper[4618]: I0121 09:16:21.108388 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:16:21 crc kubenswrapper[4618]: I0121 09:16:21.108668 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" event={"ID":"cad4873a-5a2e-40ea-a4b1-3173e8138be0","Type":"ContainerStarted","Data":"e74df4ae4e7717544cd7bc6fba5fad94f452a744c8d32db413cba8c72c75faff"} Jan 21 09:16:21 crc kubenswrapper[4618]: I0121 09:16:21.108854 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 
09:16:21 crc kubenswrapper[4618]: I0121 09:16:21.131726 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" podStartSLOduration=34.130588122 podStartE2EDuration="36.131712488s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:16:18.161374807 +0000 UTC m=+776.911842124" lastFinishedPulling="2026-01-21 09:16:20.162499174 +0000 UTC m=+778.912966490" observedRunningTime="2026-01-21 09:16:21.126480927 +0000 UTC m=+779.876948244" watchObservedRunningTime="2026-01-21 09:16:21.131712488 +0000 UTC m=+779.882179805" Jan 21 09:16:21 crc kubenswrapper[4618]: I0121 09:16:21.139982 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" podStartSLOduration=34.005756684 podStartE2EDuration="36.139969968s" podCreationTimestamp="2026-01-21 09:15:45 +0000 UTC" firstStartedPulling="2026-01-21 09:16:18.027904771 +0000 UTC m=+776.778372088" lastFinishedPulling="2026-01-21 09:16:20.162118056 +0000 UTC m=+778.912585372" observedRunningTime="2026-01-21 09:16:21.13599825 +0000 UTC m=+779.886465566" watchObservedRunningTime="2026-01-21 09:16:21.139969968 +0000 UTC m=+779.890437285" Jan 21 09:16:26 crc kubenswrapper[4618]: I0121 09:16:26.958776 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:16:26 crc kubenswrapper[4618]: I0121 09:16:26.959020 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:16:27 crc kubenswrapper[4618]: I0121 09:16:27.695278 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-dsjzx" Jan 21 09:16:27 crc kubenswrapper[4618]: I0121 09:16:27.816636 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986dvc9c5" Jan 21 09:16:28 crc kubenswrapper[4618]: I0121 09:16:28.437767 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-42lr9" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.591126 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-dp22c"] Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.592649 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.593833 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xgk7m" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.594031 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.594465 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.594680 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.604359 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-dp22c"] Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.636314 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-b6px4"] Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.637339 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.638785 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.651322 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-b6px4"] Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.702242 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/3a46fa0b-7185-4171-94e8-b997c2abfa30-kube-api-access-cmd75\") pod \"dnsmasq-dns-84bb9d8bd9-dp22c\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.702307 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-dns-svc\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.702392 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a46fa0b-7185-4171-94e8-b997c2abfa30-config\") pod \"dnsmasq-dns-84bb9d8bd9-dp22c\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.702416 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-config\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 
09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.702447 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kllcd\" (UniqueName: \"kubernetes.io/projected/f971f6d0-ae98-4309-a547-981c6960e545-kube-api-access-kllcd\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.803415 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a46fa0b-7185-4171-94e8-b997c2abfa30-config\") pod \"dnsmasq-dns-84bb9d8bd9-dp22c\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.803462 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-config\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.803491 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kllcd\" (UniqueName: \"kubernetes.io/projected/f971f6d0-ae98-4309-a547-981c6960e545-kube-api-access-kllcd\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.803519 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/3a46fa0b-7185-4171-94e8-b997c2abfa30-kube-api-access-cmd75\") pod \"dnsmasq-dns-84bb9d8bd9-dp22c\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc 
kubenswrapper[4618]: I0121 09:16:44.803547 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-dns-svc\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.804326 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-dns-svc\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.804377 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a46fa0b-7185-4171-94e8-b997c2abfa30-config\") pod \"dnsmasq-dns-84bb9d8bd9-dp22c\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.804394 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-config\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.819415 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/3a46fa0b-7185-4171-94e8-b997c2abfa30-kube-api-access-cmd75\") pod \"dnsmasq-dns-84bb9d8bd9-dp22c\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.819464 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kllcd\" (UniqueName: \"kubernetes.io/projected/f971f6d0-ae98-4309-a547-981c6960e545-kube-api-access-kllcd\") pod \"dnsmasq-dns-5f854695bc-b6px4\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.909905 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:44 crc kubenswrapper[4618]: I0121 09:16:44.951567 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:45 crc kubenswrapper[4618]: I0121 09:16:45.261153 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-dp22c"] Jan 21 09:16:45 crc kubenswrapper[4618]: I0121 09:16:45.311051 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-b6px4"] Jan 21 09:16:45 crc kubenswrapper[4618]: W0121 09:16:45.311943 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf971f6d0_ae98_4309_a547_981c6960e545.slice/crio-b32854286e52fdc146ceb2c1e0d7a986d75b673347d4a2aa833b668bf6f7bded WatchSource:0}: Error finding container b32854286e52fdc146ceb2c1e0d7a986d75b673347d4a2aa833b668bf6f7bded: Status 404 returned error can't find the container with id b32854286e52fdc146ceb2c1e0d7a986d75b673347d4a2aa833b668bf6f7bded Jan 21 09:16:46 crc kubenswrapper[4618]: I0121 09:16:46.221117 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" event={"ID":"f971f6d0-ae98-4309-a547-981c6960e545","Type":"ContainerStarted","Data":"b32854286e52fdc146ceb2c1e0d7a986d75b673347d4a2aa833b668bf6f7bded"} Jan 21 09:16:46 crc kubenswrapper[4618]: I0121 09:16:46.222031 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" 
event={"ID":"3a46fa0b-7185-4171-94e8-b997c2abfa30","Type":"ContainerStarted","Data":"d2fbacf3415929edf56b0e7616a150399c3f9d8659557f134baaac924244f844"} Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.410694 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-b6px4"] Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.425827 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-fcjz8"] Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.426885 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.431607 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-fcjz8"] Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.542484 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhjxw\" (UniqueName: \"kubernetes.io/projected/54756946-3764-4748-abb7-4a230bab6d21-kube-api-access-bhjxw\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.542732 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-config\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.542767 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " 
pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.644660 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-config\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.644717 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.644750 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhjxw\" (UniqueName: \"kubernetes.io/projected/54756946-3764-4748-abb7-4a230bab6d21-kube-api-access-bhjxw\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.645776 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-config\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.646439 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.661870 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhjxw\" (UniqueName: \"kubernetes.io/projected/54756946-3764-4748-abb7-4a230bab6d21-kube-api-access-bhjxw\") pod \"dnsmasq-dns-744ffd65bc-fcjz8\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.687511 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-dp22c"] Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.702799 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-z8fw4"] Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.703987 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.710787 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-z8fw4"] Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.746881 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-dns-svc\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.746966 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthwg\" (UniqueName: \"kubernetes.io/projected/aa6809e9-cec7-46ff-983a-2ae596e84add-kube-api-access-sthwg\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.747552 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-config\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.749274 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.848408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-dns-svc\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.848503 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthwg\" (UniqueName: \"kubernetes.io/projected/aa6809e9-cec7-46ff-983a-2ae596e84add-kube-api-access-sthwg\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.848585 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-config\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.849221 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-dns-svc\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.849309 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-config\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:47 crc kubenswrapper[4618]: I0121 09:16:47.862975 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthwg\" (UniqueName: \"kubernetes.io/projected/aa6809e9-cec7-46ff-983a-2ae596e84add-kube-api-access-sthwg\") pod \"dnsmasq-dns-95f5f6995-z8fw4\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.017561 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.567378 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.568917 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.573229 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.573413 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.573637 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.573750 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.573932 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.574117 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-677wz" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.574329 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.578957 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658568 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a9f652e-69d5-4c54-a3e8-9d926313e47d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658785 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxtg\" (UniqueName: 
\"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-kube-api-access-4dxtg\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658807 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a9f652e-69d5-4c54-a3e8-9d926313e47d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658857 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658876 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658912 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658939 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658967 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.658985 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.659004 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.659031 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-config-data\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.762485 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.762576 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-config-data\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.762650 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxtg\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-kube-api-access-4dxtg\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.762905 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a9f652e-69d5-4c54-a3e8-9d926313e47d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.762948 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a9f652e-69d5-4c54-a3e8-9d926313e47d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.763000 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 
09:16:48.763021 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.763054 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.763087 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.764907 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.765159 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.765222 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.765277 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.767849 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.767889 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.768002 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.768251 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-config-data\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc 
kubenswrapper[4618]: I0121 09:16:48.772233 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.772239 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.774282 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a9f652e-69d5-4c54-a3e8-9d926313e47d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.774319 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a9f652e-69d5-4c54-a3e8-9d926313e47d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.777605 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxtg\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-kube-api-access-4dxtg\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.795878 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.804655 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.805790 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.811301 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.811488 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.811657 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.811841 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.812802 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2sgc6" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.814255 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.814369 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.821151 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866409 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866440 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866465 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866495 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866510 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866605 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d8d0c9b-9097-462d-904e-7ff5126b1056-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866634 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866665 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d8d0c9b-9097-462d-904e-7ff5126b1056-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866714 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866759 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.866773 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7wv7\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-kube-api-access-w7wv7\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.888724 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968566 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968613 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968634 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d8d0c9b-9097-462d-904e-7ff5126b1056-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968651 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc 
kubenswrapper[4618]: I0121 09:16:48.968670 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d8d0c9b-9097-462d-904e-7ff5126b1056-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968702 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968736 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968749 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7wv7\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-kube-api-access-w7wv7\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968827 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968844 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.968867 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.969279 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.969427 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.971591 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.972543 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.973453 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.973675 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.974990 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.976819 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d8d0c9b-9097-462d-904e-7ff5126b1056-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.977634 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.979362 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d8d0c9b-9097-462d-904e-7ff5126b1056-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.985692 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7wv7\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-kube-api-access-w7wv7\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:48 crc kubenswrapper[4618]: I0121 09:16:48.989294 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.136081 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.974769 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.977407 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.979963 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-k68ds" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.981103 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.982959 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.983293 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.984695 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 21 09:16:49 crc kubenswrapper[4618]: I0121 09:16:49.985621 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.084889 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.084928 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a99731-3bad-4a35-97bc-2431645071bb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.085164 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-kolla-config\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.085269 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rp2\" (UniqueName: \"kubernetes.io/projected/33a99731-3bad-4a35-97bc-2431645071bb-kube-api-access-d8rp2\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.085301 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.085318 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33a99731-3bad-4a35-97bc-2431645071bb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.085435 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a99731-3bad-4a35-97bc-2431645071bb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.085525 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-config-data-default\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186476 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8rp2\" (UniqueName: \"kubernetes.io/projected/33a99731-3bad-4a35-97bc-2431645071bb-kube-api-access-d8rp2\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186523 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186541 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33a99731-3bad-4a35-97bc-2431645071bb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186573 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a99731-3bad-4a35-97bc-2431645071bb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186616 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186676 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186703 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a99731-3bad-4a35-97bc-2431645071bb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.186727 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-kolla-config\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.187333 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/33a99731-3bad-4a35-97bc-2431645071bb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.187988 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-config-data-default\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.188286 
4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.188446 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.189070 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33a99731-3bad-4a35-97bc-2431645071bb-kolla-config\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.191655 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a99731-3bad-4a35-97bc-2431645071bb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.192101 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a99731-3bad-4a35-97bc-2431645071bb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.201980 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rp2\" (UniqueName: 
\"kubernetes.io/projected/33a99731-3bad-4a35-97bc-2431645071bb-kube-api-access-d8rp2\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.202990 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"33a99731-3bad-4a35-97bc-2431645071bb\") " pod="openstack/openstack-galera-0" Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.278805 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-z8fw4"] Jan 21 09:16:50 crc kubenswrapper[4618]: I0121 09:16:50.296030 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.413480 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.414960 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.417195 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7wd5g" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.417463 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.417687 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.418167 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.421015 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.502803 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npmx5\" (UniqueName: \"kubernetes.io/projected/424be4c5-5cc7-4641-b497-f01556c3d8ea-kube-api-access-npmx5\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.502835 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/424be4c5-5cc7-4641-b497-f01556c3d8ea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.502884 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/424be4c5-5cc7-4641-b497-f01556c3d8ea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.502962 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.503086 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.503267 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/424be4c5-5cc7-4641-b497-f01556c3d8ea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.503306 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.503421 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604742 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604816 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/424be4c5-5cc7-4641-b497-f01556c3d8ea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604837 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604891 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604913 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npmx5\" (UniqueName: 
\"kubernetes.io/projected/424be4c5-5cc7-4641-b497-f01556c3d8ea-kube-api-access-npmx5\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604928 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424be4c5-5cc7-4641-b497-f01556c3d8ea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604943 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/424be4c5-5cc7-4641-b497-f01556c3d8ea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.604965 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.605502 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.606049 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.608396 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/424be4c5-5cc7-4641-b497-f01556c3d8ea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.608482 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.611863 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/424be4c5-5cc7-4641-b497-f01556c3d8ea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.612620 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/424be4c5-5cc7-4641-b497-f01556c3d8ea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.612771 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424be4c5-5cc7-4641-b497-f01556c3d8ea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " 
pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.621683 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmx5\" (UniqueName: \"kubernetes.io/projected/424be4c5-5cc7-4641-b497-f01556c3d8ea-kube-api-access-npmx5\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.624594 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"424be4c5-5cc7-4641-b497-f01556c3d8ea\") " pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.734251 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.845314 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.846244 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.848221 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.848290 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-wqmkm" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.849041 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.854308 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.908338 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b3230e-10e8-4707-8944-b59b1870a4fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.908404 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0b3230e-10e8-4707-8944-b59b1870a4fc-kolla-config\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.908492 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b3230e-10e8-4707-8944-b59b1870a4fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.908616 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0b3230e-10e8-4707-8944-b59b1870a4fc-config-data\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:51 crc kubenswrapper[4618]: I0121 09:16:51.908750 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ghnx\" (UniqueName: \"kubernetes.io/projected/f0b3230e-10e8-4707-8944-b59b1870a4fc-kube-api-access-7ghnx\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.010347 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b3230e-10e8-4707-8944-b59b1870a4fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.010382 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0b3230e-10e8-4707-8944-b59b1870a4fc-kolla-config\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.010425 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b3230e-10e8-4707-8944-b59b1870a4fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.010459 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0b3230e-10e8-4707-8944-b59b1870a4fc-config-data\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " 
pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.010516 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ghnx\" (UniqueName: \"kubernetes.io/projected/f0b3230e-10e8-4707-8944-b59b1870a4fc-kube-api-access-7ghnx\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.011132 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0b3230e-10e8-4707-8944-b59b1870a4fc-kolla-config\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.011264 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0b3230e-10e8-4707-8944-b59b1870a4fc-config-data\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.013447 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0b3230e-10e8-4707-8944-b59b1870a4fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.014736 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b3230e-10e8-4707-8944-b59b1870a4fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.023114 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ghnx\" (UniqueName: 
\"kubernetes.io/projected/f0b3230e-10e8-4707-8944-b59b1870a4fc-kube-api-access-7ghnx\") pod \"memcached-0\" (UID: \"f0b3230e-10e8-4707-8944-b59b1870a4fc\") " pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.162857 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 21 09:16:52 crc kubenswrapper[4618]: I0121 09:16:52.256467 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" event={"ID":"aa6809e9-cec7-46ff-983a-2ae596e84add","Type":"ContainerStarted","Data":"0a167182c35c73fe86de338cdd40346720ce32b5ab5cf5b40ec93f7620253d7a"} Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.663847 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.665489 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.671212 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-f9lpg" Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.703356 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.730818 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6dqq\" (UniqueName: \"kubernetes.io/projected/c74af7ae-2eee-4b63-8515-0230ddf143c8-kube-api-access-w6dqq\") pod \"kube-state-metrics-0\" (UID: \"c74af7ae-2eee-4b63-8515-0230ddf143c8\") " pod="openstack/kube-state-metrics-0" Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.831826 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6dqq\" (UniqueName: 
\"kubernetes.io/projected/c74af7ae-2eee-4b63-8515-0230ddf143c8-kube-api-access-w6dqq\") pod \"kube-state-metrics-0\" (UID: \"c74af7ae-2eee-4b63-8515-0230ddf143c8\") " pod="openstack/kube-state-metrics-0" Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.854798 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6dqq\" (UniqueName: \"kubernetes.io/projected/c74af7ae-2eee-4b63-8515-0230ddf143c8-kube-api-access-w6dqq\") pod \"kube-state-metrics-0\" (UID: \"c74af7ae-2eee-4b63-8515-0230ddf143c8\") " pod="openstack/kube-state-metrics-0" Jan 21 09:16:53 crc kubenswrapper[4618]: I0121 09:16:53.981046 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.847615 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-fcjz8"] Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.867005 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.948068 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:16:55 crc kubenswrapper[4618]: W0121 09:16:55.949117 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8d0c9b_9097_462d_904e_7ff5126b1056.slice/crio-a220c0767364c77f9016a122d039a01b5ea44cbef776b971e42c437447a464e5 WatchSource:0}: Error finding container a220c0767364c77f9016a122d039a01b5ea44cbef776b971e42c437447a464e5: Status 404 returned error can't find the container with id a220c0767364c77f9016a122d039a01b5ea44cbef776b971e42c437447a464e5 Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.985498 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 09:16:55 crc kubenswrapper[4618]: W0121 09:16:55.986626 
4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc74af7ae_2eee_4b63_8515_0230ddf143c8.slice/crio-07c40f291cfd3f5d0a4e9d742fd8d51723ed37a772932f5a5a4e041ea0136833 WatchSource:0}: Error finding container 07c40f291cfd3f5d0a4e9d742fd8d51723ed37a772932f5a5a4e041ea0136833: Status 404 returned error can't find the container with id 07c40f291cfd3f5d0a4e9d742fd8d51723ed37a772932f5a5a4e041ea0136833 Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.990503 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:16:55 crc kubenswrapper[4618]: W0121 09:16:55.992732 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33a99731_3bad_4a35_97bc_2431645071bb.slice/crio-f85a1c48a3ed65c3f7bf1db1e28188a6c8bc0165a92b4147746817d6fcff27d5 WatchSource:0}: Error finding container f85a1c48a3ed65c3f7bf1db1e28188a6c8bc0165a92b4147746817d6fcff27d5: Status 404 returned error can't find the container with id f85a1c48a3ed65c3f7bf1db1e28188a6c8bc0165a92b4147746817d6fcff27d5 Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.994137 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 09:16:55 crc kubenswrapper[4618]: W0121 09:16:55.994527 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod424be4c5_5cc7_4641_b497_f01556c3d8ea.slice/crio-f26dfe3226621b414640b4bfba542cc5ad35eb6efa5a1736bfc66f2dec1aac19 WatchSource:0}: Error finding container f26dfe3226621b414640b4bfba542cc5ad35eb6efa5a1736bfc66f2dec1aac19: Status 404 returned error can't find the container with id f26dfe3226621b414640b4bfba542cc5ad35eb6efa5a1736bfc66f2dec1aac19 Jan 21 09:16:55 crc kubenswrapper[4618]: W0121 09:16:55.996284 4618 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b3230e_10e8_4707_8944_b59b1870a4fc.slice/crio-04f8ca44b3fac8c9b793a0def7e5b40f5ee0950adba5be4e9667fad64257fea8 WatchSource:0}: Error finding container 04f8ca44b3fac8c9b793a0def7e5b40f5ee0950adba5be4e9667fad64257fea8: Status 404 returned error can't find the container with id 04f8ca44b3fac8c9b793a0def7e5b40f5ee0950adba5be4e9667fad64257fea8 Jan 21 09:16:55 crc kubenswrapper[4618]: I0121 09:16:55.998704 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.275103 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0a9f652e-69d5-4c54-a3e8-9d926313e47d","Type":"ContainerStarted","Data":"fe063e0551b772449512b0ec7d14ee2c33967b5fb57352f09f3c87f6f8b1e2ee"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.278452 4618 generic.go:334] "Generic (PLEG): container finished" podID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerID="610ebb6a065b0cc2985c175610107b4f6a79bc053993156cfa26d8d2cf77d4dc" exitCode=0 Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.278538 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" event={"ID":"aa6809e9-cec7-46ff-983a-2ae596e84add","Type":"ContainerDied","Data":"610ebb6a065b0cc2985c175610107b4f6a79bc053993156cfa26d8d2cf77d4dc"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.279806 4618 generic.go:334] "Generic (PLEG): container finished" podID="3a46fa0b-7185-4171-94e8-b997c2abfa30" containerID="bfe02ede99b1979c4e6cb090d5f75b56ec2a587d4f64c8909265c1d9eef2ffd9" exitCode=0 Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.279859 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" 
event={"ID":"3a46fa0b-7185-4171-94e8-b997c2abfa30","Type":"ContainerDied","Data":"bfe02ede99b1979c4e6cb090d5f75b56ec2a587d4f64c8909265c1d9eef2ffd9"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.281663 4618 generic.go:334] "Generic (PLEG): container finished" podID="f971f6d0-ae98-4309-a547-981c6960e545" containerID="8b4a2ccc581d74ab290fd4f35f5e92c0a7d3b03bf33066c95c0e8e6289fa7b42" exitCode=0 Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.281715 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" event={"ID":"f971f6d0-ae98-4309-a547-981c6960e545","Type":"ContainerDied","Data":"8b4a2ccc581d74ab290fd4f35f5e92c0a7d3b03bf33066c95c0e8e6289fa7b42"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.285480 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c74af7ae-2eee-4b63-8515-0230ddf143c8","Type":"ContainerStarted","Data":"07c40f291cfd3f5d0a4e9d742fd8d51723ed37a772932f5a5a4e041ea0136833"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.286885 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d8d0c9b-9097-462d-904e-7ff5126b1056","Type":"ContainerStarted","Data":"a220c0767364c77f9016a122d039a01b5ea44cbef776b971e42c437447a464e5"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.288913 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33a99731-3bad-4a35-97bc-2431645071bb","Type":"ContainerStarted","Data":"f85a1c48a3ed65c3f7bf1db1e28188a6c8bc0165a92b4147746817d6fcff27d5"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.290625 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f0b3230e-10e8-4707-8944-b59b1870a4fc","Type":"ContainerStarted","Data":"04f8ca44b3fac8c9b793a0def7e5b40f5ee0950adba5be4e9667fad64257fea8"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.292945 
4618 generic.go:334] "Generic (PLEG): container finished" podID="54756946-3764-4748-abb7-4a230bab6d21" containerID="baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4" exitCode=0 Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.292995 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" event={"ID":"54756946-3764-4748-abb7-4a230bab6d21","Type":"ContainerDied","Data":"baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.293012 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" event={"ID":"54756946-3764-4748-abb7-4a230bab6d21","Type":"ContainerStarted","Data":"fa8f7945b9c101b6a1c14307218fbeecfd667eb0ea87d626ed7a31729f836b71"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.293893 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"424be4c5-5cc7-4641-b497-f01556c3d8ea","Type":"ContainerStarted","Data":"f26dfe3226621b414640b4bfba542cc5ad35eb6efa5a1736bfc66f2dec1aac19"} Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.506764 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.567252 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-dns-svc\") pod \"f971f6d0-ae98-4309-a547-981c6960e545\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.567297 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kllcd\" (UniqueName: \"kubernetes.io/projected/f971f6d0-ae98-4309-a547-981c6960e545-kube-api-access-kllcd\") pod \"f971f6d0-ae98-4309-a547-981c6960e545\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.567324 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-config\") pod \"f971f6d0-ae98-4309-a547-981c6960e545\" (UID: \"f971f6d0-ae98-4309-a547-981c6960e545\") " Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.569435 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.571316 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f971f6d0-ae98-4309-a547-981c6960e545-kube-api-access-kllcd" (OuterVolumeSpecName: "kube-api-access-kllcd") pod "f971f6d0-ae98-4309-a547-981c6960e545" (UID: "f971f6d0-ae98-4309-a547-981c6960e545"). InnerVolumeSpecName "kube-api-access-kllcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.590354 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-config" (OuterVolumeSpecName: "config") pod "f971f6d0-ae98-4309-a547-981c6960e545" (UID: "f971f6d0-ae98-4309-a547-981c6960e545"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.591948 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f971f6d0-ae98-4309-a547-981c6960e545" (UID: "f971f6d0-ae98-4309-a547-981c6960e545"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.668494 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/3a46fa0b-7185-4171-94e8-b997c2abfa30-kube-api-access-cmd75\") pod \"3a46fa0b-7185-4171-94e8-b997c2abfa30\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.668587 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a46fa0b-7185-4171-94e8-b997c2abfa30-config\") pod \"3a46fa0b-7185-4171-94e8-b997c2abfa30\" (UID: \"3a46fa0b-7185-4171-94e8-b997c2abfa30\") " Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.668875 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.668892 4618 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kllcd\" (UniqueName: \"kubernetes.io/projected/f971f6d0-ae98-4309-a547-981c6960e545-kube-api-access-kllcd\") on node \"crc\" DevicePath \"\"" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.668901 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f971f6d0-ae98-4309-a547-981c6960e545-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.679302 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a46fa0b-7185-4171-94e8-b997c2abfa30-kube-api-access-cmd75" (OuterVolumeSpecName: "kube-api-access-cmd75") pod "3a46fa0b-7185-4171-94e8-b997c2abfa30" (UID: "3a46fa0b-7185-4171-94e8-b997c2abfa30"). InnerVolumeSpecName "kube-api-access-cmd75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.679866 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a46fa0b-7185-4171-94e8-b997c2abfa30-config" (OuterVolumeSpecName: "config") pod "3a46fa0b-7185-4171-94e8-b997c2abfa30" (UID: "3a46fa0b-7185-4171-94e8-b997c2abfa30"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.770355 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/3a46fa0b-7185-4171-94e8-b997c2abfa30-kube-api-access-cmd75\") on node \"crc\" DevicePath \"\"" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.770377 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a46fa0b-7185-4171-94e8-b997c2abfa30-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.958984 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:16:56 crc kubenswrapper[4618]: I0121 09:16:56.959031 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.305368 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" event={"ID":"3a46fa0b-7185-4171-94e8-b997c2abfa30","Type":"ContainerDied","Data":"d2fbacf3415929edf56b0e7616a150399c3f9d8659557f134baaac924244f844"} Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.305421 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-dp22c" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.305423 4618 scope.go:117] "RemoveContainer" containerID="bfe02ede99b1979c4e6cb090d5f75b56ec2a587d4f64c8909265c1d9eef2ffd9" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.311802 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" event={"ID":"54756946-3764-4748-abb7-4a230bab6d21","Type":"ContainerStarted","Data":"3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba"} Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.311862 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.313205 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" event={"ID":"f971f6d0-ae98-4309-a547-981c6960e545","Type":"ContainerDied","Data":"b32854286e52fdc146ceb2c1e0d7a986d75b673347d4a2aa833b668bf6f7bded"} Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.313342 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-b6px4" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.314899 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" event={"ID":"aa6809e9-cec7-46ff-983a-2ae596e84add","Type":"ContainerStarted","Data":"e0b6379b137637b274434eaf5e9bd8a43f2190a5792c18f0cad01a94fbdb2603"} Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.315082 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.326134 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" podStartSLOduration=10.326123726 podStartE2EDuration="10.326123726s" podCreationTimestamp="2026-01-21 09:16:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:16:57.323726281 +0000 UTC m=+816.074193598" watchObservedRunningTime="2026-01-21 09:16:57.326123726 +0000 UTC m=+816.076591043" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.336598 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" podStartSLOduration=6.483135858 podStartE2EDuration="10.336590185s" podCreationTimestamp="2026-01-21 09:16:47 +0000 UTC" firstStartedPulling="2026-01-21 09:16:51.648874161 +0000 UTC m=+810.399341478" lastFinishedPulling="2026-01-21 09:16:55.502328488 +0000 UTC m=+814.252795805" observedRunningTime="2026-01-21 09:16:57.335960529 +0000 UTC m=+816.086427846" watchObservedRunningTime="2026-01-21 09:16:57.336590185 +0000 UTC m=+816.087057502" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.362093 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-dp22c"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.367167 4618 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-dp22c"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.378683 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-b6px4"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.382090 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-b6px4"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.544947 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a46fa0b-7185-4171-94e8-b997c2abfa30" path="/var/lib/kubelet/pods/3a46fa0b-7185-4171-94e8-b997c2abfa30/volumes" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.545666 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f971f6d0-ae98-4309-a547-981c6960e545" path="/var/lib/kubelet/pods/f971f6d0-ae98-4309-a547-981c6960e545/volumes" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.614336 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 09:16:57 crc kubenswrapper[4618]: E0121 09:16:57.614616 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a46fa0b-7185-4171-94e8-b997c2abfa30" containerName="init" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.614629 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a46fa0b-7185-4171-94e8-b997c2abfa30" containerName="init" Jan 21 09:16:57 crc kubenswrapper[4618]: E0121 09:16:57.614640 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f971f6d0-ae98-4309-a547-981c6960e545" containerName="init" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.614645 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f971f6d0-ae98-4309-a547-981c6960e545" containerName="init" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.614766 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a46fa0b-7185-4171-94e8-b997c2abfa30" 
containerName="init" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.614775 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f971f6d0-ae98-4309-a547-981c6960e545" containerName="init" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.615481 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.617832 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.617919 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.619050 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.619058 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.619166 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5d8gj" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.623451 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.683813 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc37a59b-ed3a-4007-b6bc-da3078536c98-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.683851 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.683873 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc37a59b-ed3a-4007-b6bc-da3078536c98-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.684047 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.684198 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc37a59b-ed3a-4007-b6bc-da3078536c98-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.684226 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.684264 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.684287 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ln8h\" (UniqueName: \"kubernetes.io/projected/dc37a59b-ed3a-4007-b6bc-da3078536c98-kube-api-access-9ln8h\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.752478 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v4n6j"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.753362 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.756432 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-z8pqg"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.757759 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.759985 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.760936 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.761068 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rnrqw" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.762062 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4n6j"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.766189 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z8pqg"] Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785308 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-combined-ca-bundle\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785517 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-scripts\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785557 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785572 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-ovn-controller-tls-certs\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785590 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6566w\" (UniqueName: \"kubernetes.io/projected/27ea43db-9444-46a2-aa4f-824245113798-kube-api-access-6566w\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785615 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27ea43db-9444-46a2-aa4f-824245113798-scripts\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785629 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-etc-ovs\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785644 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-run\") pod \"ovn-controller-ovs-z8pqg\" (UID: 
\"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785661 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjwp\" (UniqueName: \"kubernetes.io/projected/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-kube-api-access-zkjwp\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785677 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-log-ovn\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785699 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc37a59b-ed3a-4007-b6bc-da3078536c98-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785717 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785736 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785752 
4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ln8h\" (UniqueName: \"kubernetes.io/projected/dc37a59b-ed3a-4007-b6bc-da3078536c98-kube-api-access-9ln8h\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785773 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-lib\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785809 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-log\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785827 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-run\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785841 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc37a59b-ed3a-4007-b6bc-da3078536c98-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785859 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785877 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc37a59b-ed3a-4007-b6bc-da3078536c98-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.785903 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-run-ovn\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.787317 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.788600 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc37a59b-ed3a-4007-b6bc-da3078536c98-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.790917 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc37a59b-ed3a-4007-b6bc-da3078536c98-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " 
pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.791274 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc37a59b-ed3a-4007-b6bc-da3078536c98-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.792703 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.805093 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.805371 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc37a59b-ed3a-4007-b6bc-da3078536c98-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.806722 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.822386 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ln8h\" (UniqueName: 
\"kubernetes.io/projected/dc37a59b-ed3a-4007-b6bc-da3078536c98-kube-api-access-9ln8h\") pod \"ovsdbserver-nb-0\" (UID: \"dc37a59b-ed3a-4007-b6bc-da3078536c98\") " pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.886888 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-run-ovn\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.886927 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-combined-ca-bundle\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.886959 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-scripts\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.886989 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-ovn-controller-tls-certs\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887005 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6566w\" (UniqueName: \"kubernetes.io/projected/27ea43db-9444-46a2-aa4f-824245113798-kube-api-access-6566w\") pod \"ovn-controller-ovs-z8pqg\" (UID: 
\"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887029 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27ea43db-9444-46a2-aa4f-824245113798-scripts\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887044 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-etc-ovs\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887056 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-run\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887069 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkjwp\" (UniqueName: \"kubernetes.io/projected/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-kube-api-access-zkjwp\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887089 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-log-ovn\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 
09:16:57.887123 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-lib\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887170 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-log\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887185 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-run\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887552 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-run\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.887630 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-run-ovn\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.888042 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-run\") pod 
\"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.888087 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-var-log-ovn\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.888089 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-etc-ovs\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.888374 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-lib\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.888500 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/27ea43db-9444-46a2-aa4f-824245113798-var-log\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.889867 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-scripts\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.889922 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27ea43db-9444-46a2-aa4f-824245113798-scripts\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.890429 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-combined-ca-bundle\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.901946 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6566w\" (UniqueName: \"kubernetes.io/projected/27ea43db-9444-46a2-aa4f-824245113798-kube-api-access-6566w\") pod \"ovn-controller-ovs-z8pqg\" (UID: \"27ea43db-9444-46a2-aa4f-824245113798\") " pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.902356 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkjwp\" (UniqueName: \"kubernetes.io/projected/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-kube-api-access-zkjwp\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.909452 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a889a44f-3ea7-4b43-b5ea-1f365a9611ac-ovn-controller-tls-certs\") pod \"ovn-controller-v4n6j\" (UID: \"a889a44f-3ea7-4b43-b5ea-1f365a9611ac\") " pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:57 crc kubenswrapper[4618]: I0121 09:16:57.936476 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 21 09:16:58 crc kubenswrapper[4618]: I0121 09:16:58.077154 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4n6j" Jan 21 09:16:58 crc kubenswrapper[4618]: I0121 09:16:58.093442 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:16:59 crc kubenswrapper[4618]: I0121 09:16:59.313393 4618 scope.go:117] "RemoveContainer" containerID="8b4a2ccc581d74ab290fd4f35f5e92c0a7d3b03bf33066c95c0e8e6289fa7b42" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.083237 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 09:17:01 crc kubenswrapper[4618]: W0121 09:17:01.084654 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc37a59b_ed3a_4007_b6bc_da3078536c98.slice/crio-a920e5c8a8324e9dc1ae13c73ea7f127d7dc8396f8657648810ad5ac6d591268 WatchSource:0}: Error finding container a920e5c8a8324e9dc1ae13c73ea7f127d7dc8396f8657648810ad5ac6d591268: Status 404 returned error can't find the container with id a920e5c8a8324e9dc1ae13c73ea7f127d7dc8396f8657648810ad5ac6d591268 Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.110999 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4n6j"] Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.117432 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.118399 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: W0121 09:17:01.118592 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda889a44f_3ea7_4b43_b5ea_1f365a9611ac.slice/crio-bcfd4b8e9794094f50963eddd7b9cb1916bba97632a8af4095dfb1899388986e WatchSource:0}: Error finding container bcfd4b8e9794094f50963eddd7b9cb1916bba97632a8af4095dfb1899388986e: Status 404 returned error can't find the container with id bcfd4b8e9794094f50963eddd7b9cb1916bba97632a8af4095dfb1899388986e Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.119749 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.120008 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.120078 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rj67w" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.121271 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.128756 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132324 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132399 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132419 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzlj\" (UniqueName: \"kubernetes.io/projected/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-kube-api-access-znzlj\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132459 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132479 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132495 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132507 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-config\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.132658 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.180864 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z8pqg"] Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.233886 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234064 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzlj\" (UniqueName: \"kubernetes.io/projected/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-kube-api-access-znzlj\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234242 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234336 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234411 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234484 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-config\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234575 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234677 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.235103 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-config\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" 
Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.235062 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.234608 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.235331 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.238587 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.238680 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.239026 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.246858 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzlj\" (UniqueName: \"kubernetes.io/projected/93d72e0b-9c67-4d3c-8eaf-b40cbf04df89-kube-api-access-znzlj\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.250348 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89\") " pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.336803 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"424be4c5-5cc7-4641-b497-f01556c3d8ea","Type":"ContainerStarted","Data":"6c1ffcb536f7e1130c4f4f221e9a970ab3a1c01c2f083868ce4d1eb7836e9a1f"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.337789 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z8pqg" event={"ID":"27ea43db-9444-46a2-aa4f-824245113798","Type":"ContainerStarted","Data":"19412443edae38a1da21187e95123b14678a45692c0eca010a492457f673df9d"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.339504 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c74af7ae-2eee-4b63-8515-0230ddf143c8","Type":"ContainerStarted","Data":"7b2631fd49caf28d1118fdfa040ce0694b5c656219d650d65cf61960b915de81"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.339626 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" 
Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.340507 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4n6j" event={"ID":"a889a44f-3ea7-4b43-b5ea-1f365a9611ac","Type":"ContainerStarted","Data":"bcfd4b8e9794094f50963eddd7b9cb1916bba97632a8af4095dfb1899388986e"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.342283 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33a99731-3bad-4a35-97bc-2431645071bb","Type":"ContainerStarted","Data":"aff5a70fd740942cb0c642779020050c59f6131f622cba2d67187b13672c7d78"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.343237 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f0b3230e-10e8-4707-8944-b59b1870a4fc","Type":"ContainerStarted","Data":"79aa098486ca70bd9476d4f3f38d0fd3f3ab09e945c95eb71c1ff92d7e8d68eb"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.343294 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.344011 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc37a59b-ed3a-4007-b6bc-da3078536c98","Type":"ContainerStarted","Data":"a920e5c8a8324e9dc1ae13c73ea7f127d7dc8396f8657648810ad5ac6d591268"} Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.362034 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.653231493 podStartE2EDuration="8.362021232s" podCreationTimestamp="2026-01-21 09:16:53 +0000 UTC" firstStartedPulling="2026-01-21 09:16:55.988975723 +0000 UTC m=+814.739443040" lastFinishedPulling="2026-01-21 09:17:00.697765462 +0000 UTC m=+819.448232779" observedRunningTime="2026-01-21 09:17:01.360911763 +0000 UTC m=+820.111379080" watchObservedRunningTime="2026-01-21 09:17:01.362021232 +0000 UTC m=+820.112488550" Jan 21 
09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.401436 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=5.771394887 podStartE2EDuration="10.401423862s" podCreationTimestamp="2026-01-21 09:16:51 +0000 UTC" firstStartedPulling="2026-01-21 09:16:56.00019001 +0000 UTC m=+814.750657328" lastFinishedPulling="2026-01-21 09:17:00.630218985 +0000 UTC m=+819.380686303" observedRunningTime="2026-01-21 09:17:01.395950555 +0000 UTC m=+820.146417872" watchObservedRunningTime="2026-01-21 09:17:01.401423862 +0000 UTC m=+820.151891179" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.429795 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:01 crc kubenswrapper[4618]: I0121 09:17:01.869500 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 09:17:01 crc kubenswrapper[4618]: W0121 09:17:01.872954 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93d72e0b_9c67_4d3c_8eaf_b40cbf04df89.slice/crio-f86052780b52896571e845b48078d02aba357555062d6766d74649dc873ac87e WatchSource:0}: Error finding container f86052780b52896571e845b48078d02aba357555062d6766d74649dc873ac87e: Status 404 returned error can't find the container with id f86052780b52896571e845b48078d02aba357555062d6766d74649dc873ac87e Jan 21 09:17:02 crc kubenswrapper[4618]: I0121 09:17:02.351355 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d8d0c9b-9097-462d-904e-7ff5126b1056","Type":"ContainerStarted","Data":"84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea"} Jan 21 09:17:02 crc kubenswrapper[4618]: I0121 09:17:02.354390 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89","Type":"ContainerStarted","Data":"f86052780b52896571e845b48078d02aba357555062d6766d74649dc873ac87e"} Jan 21 09:17:02 crc kubenswrapper[4618]: I0121 09:17:02.356785 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0a9f652e-69d5-4c54-a3e8-9d926313e47d","Type":"ContainerStarted","Data":"b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8"} Jan 21 09:17:02 crc kubenswrapper[4618]: I0121 09:17:02.750124 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:17:03 crc kubenswrapper[4618]: I0121 09:17:03.019338 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:17:03 crc kubenswrapper[4618]: I0121 09:17:03.051337 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-fcjz8"] Jan 21 09:17:03 crc kubenswrapper[4618]: I0121 09:17:03.361180 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" podUID="54756946-3764-4748-abb7-4a230bab6d21" containerName="dnsmasq-dns" containerID="cri-o://3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba" gracePeriod=10 Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.244391 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.367697 4618 generic.go:334] "Generic (PLEG): container finished" podID="54756946-3764-4748-abb7-4a230bab6d21" containerID="3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba" exitCode=0 Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.367762 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" event={"ID":"54756946-3764-4748-abb7-4a230bab6d21","Type":"ContainerDied","Data":"3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba"} Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.367788 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" event={"ID":"54756946-3764-4748-abb7-4a230bab6d21","Type":"ContainerDied","Data":"fa8f7945b9c101b6a1c14307218fbeecfd667eb0ea87d626ed7a31729f836b71"} Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.367774 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-fcjz8" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.367803 4618 scope.go:117] "RemoveContainer" containerID="3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.369841 4618 generic.go:334] "Generic (PLEG): container finished" podID="424be4c5-5cc7-4641-b497-f01556c3d8ea" containerID="6c1ffcb536f7e1130c4f4f221e9a970ab3a1c01c2f083868ce4d1eb7836e9a1f" exitCode=0 Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.369886 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"424be4c5-5cc7-4641-b497-f01556c3d8ea","Type":"ContainerDied","Data":"6c1ffcb536f7e1130c4f4f221e9a970ab3a1c01c2f083868ce4d1eb7836e9a1f"} Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.371439 4618 generic.go:334] "Generic (PLEG): container finished" podID="33a99731-3bad-4a35-97bc-2431645071bb" containerID="aff5a70fd740942cb0c642779020050c59f6131f622cba2d67187b13672c7d78" exitCode=0 Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.371479 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33a99731-3bad-4a35-97bc-2431645071bb","Type":"ContainerDied","Data":"aff5a70fd740942cb0c642779020050c59f6131f622cba2d67187b13672c7d78"} Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.375493 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-dns-svc\") pod \"54756946-3764-4748-abb7-4a230bab6d21\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.375571 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhjxw\" (UniqueName: \"kubernetes.io/projected/54756946-3764-4748-abb7-4a230bab6d21-kube-api-access-bhjxw\") pod 
\"54756946-3764-4748-abb7-4a230bab6d21\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.375594 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-config\") pod \"54756946-3764-4748-abb7-4a230bab6d21\" (UID: \"54756946-3764-4748-abb7-4a230bab6d21\") " Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.382999 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54756946-3764-4748-abb7-4a230bab6d21-kube-api-access-bhjxw" (OuterVolumeSpecName: "kube-api-access-bhjxw") pod "54756946-3764-4748-abb7-4a230bab6d21" (UID: "54756946-3764-4748-abb7-4a230bab6d21"). InnerVolumeSpecName "kube-api-access-bhjxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.406875 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54756946-3764-4748-abb7-4a230bab6d21" (UID: "54756946-3764-4748-abb7-4a230bab6d21"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.407971 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-config" (OuterVolumeSpecName: "config") pod "54756946-3764-4748-abb7-4a230bab6d21" (UID: "54756946-3764-4748-abb7-4a230bab6d21"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.418234 4618 scope.go:117] "RemoveContainer" containerID="baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.440917 4618 scope.go:117] "RemoveContainer" containerID="3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba" Jan 21 09:17:04 crc kubenswrapper[4618]: E0121 09:17:04.441171 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba\": container with ID starting with 3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba not found: ID does not exist" containerID="3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.441197 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba"} err="failed to get container status \"3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba\": rpc error: code = NotFound desc = could not find container \"3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba\": container with ID starting with 3b068fd209e8a42a34bbf4aa9dbe03d20cb9a43f6970830298ab141b8c65c8ba not found: ID does not exist" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.441215 4618 scope.go:117] "RemoveContainer" containerID="baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4" Jan 21 09:17:04 crc kubenswrapper[4618]: E0121 09:17:04.442051 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4\": container with ID starting with 
baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4 not found: ID does not exist" containerID="baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.442098 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4"} err="failed to get container status \"baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4\": rpc error: code = NotFound desc = could not find container \"baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4\": container with ID starting with baa781d81bf5ce326221acca596c7f21cb569b51ffcc9a8598045e70a44c9ba4 not found: ID does not exist" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.477245 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.477269 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhjxw\" (UniqueName: \"kubernetes.io/projected/54756946-3764-4748-abb7-4a230bab6d21-kube-api-access-bhjxw\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.477288 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54756946-3764-4748-abb7-4a230bab6d21-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.689589 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-fcjz8"] Jan 21 09:17:04 crc kubenswrapper[4618]: I0121 09:17:04.694066 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-fcjz8"] Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.382721 4618 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc37a59b-ed3a-4007-b6bc-da3078536c98","Type":"ContainerStarted","Data":"cc614bc3d07d08a13fdc0d01a0b76637e972ae068f3e5a189bdeadb6990a0740"} Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.385599 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"424be4c5-5cc7-4641-b497-f01556c3d8ea","Type":"ContainerStarted","Data":"cfe9625c1be82cdf51910c01a09ff6b3d1f0d58a7c5d1dd49d982b0d562e6c22"} Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.387420 4618 generic.go:334] "Generic (PLEG): container finished" podID="27ea43db-9444-46a2-aa4f-824245113798" containerID="57ab1c895f55ee73af8c317c8c61a47d378945698375f1fd6aea2a37c2787c40" exitCode=0 Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.387476 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z8pqg" event={"ID":"27ea43db-9444-46a2-aa4f-824245113798","Type":"ContainerDied","Data":"57ab1c895f55ee73af8c317c8c61a47d378945698375f1fd6aea2a37c2787c40"} Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.389706 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4n6j" event={"ID":"a889a44f-3ea7-4b43-b5ea-1f365a9611ac","Type":"ContainerStarted","Data":"97f2e8d236fd2c9fa7808d8e08f8aa2f002873ceb4ac7064c993c8bd5d577d78"} Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.389896 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-v4n6j" Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.391431 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"33a99731-3bad-4a35-97bc-2431645071bb","Type":"ContainerStarted","Data":"fc36a81044c336db667f7a3c272abbe4312b0b9992dc91e410cefffc86371192"} Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.392999 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89","Type":"ContainerStarted","Data":"da7d549fcd9da3742a279ff1416b7a40f1a2e76b4cfa6c84b6fe3a4749674647"} Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.401450 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.689601149 podStartE2EDuration="15.40143957s" podCreationTimestamp="2026-01-21 09:16:50 +0000 UTC" firstStartedPulling="2026-01-21 09:16:55.996848458 +0000 UTC m=+814.747315774" lastFinishedPulling="2026-01-21 09:17:00.708686879 +0000 UTC m=+819.459154195" observedRunningTime="2026-01-21 09:17:05.398361785 +0000 UTC m=+824.148829102" watchObservedRunningTime="2026-01-21 09:17:05.40143957 +0000 UTC m=+824.151906888" Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.438864 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.73846766 podStartE2EDuration="17.438833036s" podCreationTimestamp="2026-01-21 09:16:48 +0000 UTC" firstStartedPulling="2026-01-21 09:16:55.997084522 +0000 UTC m=+814.747551840" lastFinishedPulling="2026-01-21 09:17:00.697449898 +0000 UTC m=+819.447917216" observedRunningTime="2026-01-21 09:17:05.436294666 +0000 UTC m=+824.186761984" watchObservedRunningTime="2026-01-21 09:17:05.438833036 +0000 UTC m=+824.189300354" Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.450963 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v4n6j" podStartSLOduration=5.142338277 podStartE2EDuration="8.450949483s" podCreationTimestamp="2026-01-21 09:16:57 +0000 UTC" firstStartedPulling="2026-01-21 09:17:01.121187879 +0000 UTC m=+819.871655196" lastFinishedPulling="2026-01-21 09:17:04.429799085 +0000 UTC m=+823.180266402" observedRunningTime="2026-01-21 09:17:05.448628021 +0000 UTC m=+824.199095338" watchObservedRunningTime="2026-01-21 09:17:05.450949483 +0000 UTC m=+824.201416800" 
Jan 21 09:17:05 crc kubenswrapper[4618]: I0121 09:17:05.560481 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54756946-3764-4748-abb7-4a230bab6d21" path="/var/lib/kubelet/pods/54756946-3764-4748-abb7-4a230bab6d21/volumes" Jan 21 09:17:06 crc kubenswrapper[4618]: I0121 09:17:06.402882 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z8pqg" event={"ID":"27ea43db-9444-46a2-aa4f-824245113798","Type":"ContainerStarted","Data":"b7304dae0222d56d8c52eba8076a2cadae0c6b22155ce476cd0ad09461b0db31"} Jan 21 09:17:06 crc kubenswrapper[4618]: I0121 09:17:06.403256 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z8pqg" event={"ID":"27ea43db-9444-46a2-aa4f-824245113798","Type":"ContainerStarted","Data":"e37988af04606a406e2664e103e05f8ab3c7ec86e36154520a00bf98366bf417"} Jan 21 09:17:06 crc kubenswrapper[4618]: I0121 09:17:06.403321 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:17:06 crc kubenswrapper[4618]: I0121 09:17:06.404043 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:17:06 crc kubenswrapper[4618]: I0121 09:17:06.416563 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-z8pqg" podStartSLOduration=6.195497612 podStartE2EDuration="9.416546749s" podCreationTimestamp="2026-01-21 09:16:57 +0000 UTC" firstStartedPulling="2026-01-21 09:17:01.187375177 +0000 UTC m=+819.937842494" lastFinishedPulling="2026-01-21 09:17:04.408424313 +0000 UTC m=+823.158891631" observedRunningTime="2026-01-21 09:17:06.416063609 +0000 UTC m=+825.166530926" watchObservedRunningTime="2026-01-21 09:17:06.416546749 +0000 UTC m=+825.167014065" Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.163978 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/memcached-0" Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.406973 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"93d72e0b-9c67-4d3c-8eaf-b40cbf04df89","Type":"ContainerStarted","Data":"2cd69b276467e905383e5a112f909d2b8a096f1e7412dbe78c22477bb5168a08"} Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.408627 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc37a59b-ed3a-4007-b6bc-da3078536c98","Type":"ContainerStarted","Data":"b9416d15483199a62e2d56bd1bf5a0903735cbb2d0d7a2f921f7a9d4ea50699e"} Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.421236 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=2.187734282 podStartE2EDuration="7.421218746s" podCreationTimestamp="2026-01-21 09:17:00 +0000 UTC" firstStartedPulling="2026-01-21 09:17:01.874686252 +0000 UTC m=+820.625153569" lastFinishedPulling="2026-01-21 09:17:07.108170716 +0000 UTC m=+825.858638033" observedRunningTime="2026-01-21 09:17:07.420192774 +0000 UTC m=+826.170660091" watchObservedRunningTime="2026-01-21 09:17:07.421218746 +0000 UTC m=+826.171686063" Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.430857 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.433238 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.410149664 podStartE2EDuration="11.433219966s" podCreationTimestamp="2026-01-21 09:16:56 +0000 UTC" firstStartedPulling="2026-01-21 09:17:01.087168408 +0000 UTC m=+819.837635725" lastFinishedPulling="2026-01-21 09:17:07.11023871 +0000 UTC m=+825.860706027" observedRunningTime="2026-01-21 09:17:07.431925498 +0000 UTC m=+826.182392815" watchObservedRunningTime="2026-01-21 09:17:07.433219966 
+0000 UTC m=+826.183687283" Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.455425 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:07 crc kubenswrapper[4618]: I0121 09:17:07.937053 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 21 09:17:08 crc kubenswrapper[4618]: I0121 09:17:08.414134 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.443868 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.640529 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-5dp7k"] Jan 21 09:17:09 crc kubenswrapper[4618]: E0121 09:17:09.640792 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54756946-3764-4748-abb7-4a230bab6d21" containerName="init" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.640808 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="54756946-3764-4748-abb7-4a230bab6d21" containerName="init" Jan 21 09:17:09 crc kubenswrapper[4618]: E0121 09:17:09.640817 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54756946-3764-4748-abb7-4a230bab6d21" containerName="dnsmasq-dns" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.640822 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="54756946-3764-4748-abb7-4a230bab6d21" containerName="dnsmasq-dns" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.640954 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="54756946-3764-4748-abb7-4a230bab6d21" containerName="dnsmasq-dns" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.641643 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.643095 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.649566 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-5dp7k"] Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.681284 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-svclb"] Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.682074 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.685216 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.691681 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-svclb"] Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751030 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a225614-1514-4820-8eff-8d760ef9a0b3-ovs-rundir\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751096 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a225614-1514-4820-8eff-8d760ef9a0b3-combined-ca-bundle\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751156 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-dns-svc\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751201 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751241 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j724\" (UniqueName: \"kubernetes.io/projected/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-kube-api-access-2j724\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751357 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a225614-1514-4820-8eff-8d760ef9a0b3-ovn-rundir\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751415 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9b8s\" (UniqueName: \"kubernetes.io/projected/8a225614-1514-4820-8eff-8d760ef9a0b3-kube-api-access-q9b8s\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 
09:17:09.751502 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-config\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751596 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a225614-1514-4820-8eff-8d760ef9a0b3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.751635 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a225614-1514-4820-8eff-8d760ef9a0b3-config\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852689 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852739 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j724\" (UniqueName: \"kubernetes.io/projected/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-kube-api-access-2j724\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 
09:17:09.852764 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a225614-1514-4820-8eff-8d760ef9a0b3-ovn-rundir\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852787 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9b8s\" (UniqueName: \"kubernetes.io/projected/8a225614-1514-4820-8eff-8d760ef9a0b3-kube-api-access-q9b8s\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852822 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-config\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852848 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a225614-1514-4820-8eff-8d760ef9a0b3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852866 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a225614-1514-4820-8eff-8d760ef9a0b3-config\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852885 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a225614-1514-4820-8eff-8d760ef9a0b3-ovs-rundir\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852913 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a225614-1514-4820-8eff-8d760ef9a0b3-combined-ca-bundle\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.852931 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-dns-svc\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.853042 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a225614-1514-4820-8eff-8d760ef9a0b3-ovn-rundir\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.853130 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a225614-1514-4820-8eff-8d760ef9a0b3-ovs-rundir\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.853536 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.853617 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-dns-svc\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.853672 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a225614-1514-4820-8eff-8d760ef9a0b3-config\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.853732 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-config\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.857663 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a225614-1514-4820-8eff-8d760ef9a0b3-combined-ca-bundle\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.859620 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a225614-1514-4820-8eff-8d760ef9a0b3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-svclb\" 
(UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.865132 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j724\" (UniqueName: \"kubernetes.io/projected/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-kube-api-access-2j724\") pod \"dnsmasq-dns-5b79764b65-5dp7k\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.869976 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9b8s\" (UniqueName: \"kubernetes.io/projected/8a225614-1514-4820-8eff-8d760ef9a0b3-kube-api-access-q9b8s\") pod \"ovn-controller-metrics-svclb\" (UID: \"8a225614-1514-4820-8eff-8d760ef9a0b3\") " pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.937958 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.953124 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-5dp7k"] Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.953742 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.971218 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-gnztn"] Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.972439 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.973651 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.975080 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.981351 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-gnztn"] Jan 21 09:17:09 crc kubenswrapper[4618]: I0121 09:17:09.992638 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-svclb" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.055619 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.055668 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.055729 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-dns-svc\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc 
kubenswrapper[4618]: I0121 09:17:10.055751 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kld6q\" (UniqueName: \"kubernetes.io/projected/fb73b5ff-e552-4823-bc6f-0eb876c0585e-kube-api-access-kld6q\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.055792 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-config\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.156878 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.156955 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-dns-svc\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.156976 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kld6q\" (UniqueName: \"kubernetes.io/projected/fb73b5ff-e552-4823-bc6f-0eb876c0585e-kube-api-access-kld6q\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc 
kubenswrapper[4618]: I0121 09:17:10.157017 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-config\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.157052 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.159000 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.159357 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.159718 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-config\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.159774 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-dns-svc\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.171015 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kld6q\" (UniqueName: \"kubernetes.io/projected/fb73b5ff-e552-4823-bc6f-0eb876c0585e-kube-api-access-kld6q\") pod \"dnsmasq-dns-586b989cdc-gnztn\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.296289 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.296341 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.342088 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.345478 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.373726 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-5dp7k"] Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.379249 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-svclb"] Jan 21 09:17:10 crc kubenswrapper[4618]: W0121 09:17:10.387558 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a225614_1514_4820_8eff_8d760ef9a0b3.slice/crio-c61dd900605e9d0fa457610fb187d2dec2e7656e9b00aa6f9ec6bf908d97e83d WatchSource:0}: Error finding container c61dd900605e9d0fa457610fb187d2dec2e7656e9b00aa6f9ec6bf908d97e83d: Status 404 returned error can't find the container with id c61dd900605e9d0fa457610fb187d2dec2e7656e9b00aa6f9ec6bf908d97e83d Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.424549 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-svclb" event={"ID":"8a225614-1514-4820-8eff-8d760ef9a0b3","Type":"ContainerStarted","Data":"c61dd900605e9d0fa457610fb187d2dec2e7656e9b00aa6f9ec6bf908d97e83d"} Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.425875 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" event={"ID":"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f","Type":"ContainerStarted","Data":"aae37b23d8de5debe98ccc0142c9553c1855d900e2a93dfc73782defb423ca1c"} Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.461286 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.493325 4618 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.656663 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.658476 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.665062 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wxthx" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.665299 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.665467 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.671670 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.682679 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.708473 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-gnztn"] Jan 21 09:17:10 crc kubenswrapper[4618]: W0121 09:17:10.712360 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb73b5ff_e552_4823_bc6f_0eb876c0585e.slice/crio-385662482d663cd60c757b5d59546a58dc236294ecff4bc23deb4ea991c4a8fb WatchSource:0}: Error finding container 385662482d663cd60c757b5d59546a58dc236294ecff4bc23deb4ea991c4a8fb: Status 404 returned error can't find the container with id 385662482d663cd60c757b5d59546a58dc236294ecff4bc23deb4ea991c4a8fb Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772384 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ffmx\" (UniqueName: \"kubernetes.io/projected/00770641-2364-454a-9b73-663281ad8df0-kube-api-access-2ffmx\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772474 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772598 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00770641-2364-454a-9b73-663281ad8df0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772790 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00770641-2364-454a-9b73-663281ad8df0-scripts\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772842 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772875 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.772948 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00770641-2364-454a-9b73-663281ad8df0-config\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874508 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00770641-2364-454a-9b73-663281ad8df0-config\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874568 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ffmx\" (UniqueName: \"kubernetes.io/projected/00770641-2364-454a-9b73-663281ad8df0-kube-api-access-2ffmx\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874596 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874635 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00770641-2364-454a-9b73-663281ad8df0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " 
pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874690 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00770641-2364-454a-9b73-663281ad8df0-scripts\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874714 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.874733 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.875815 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00770641-2364-454a-9b73-663281ad8df0-config\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.875825 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00770641-2364-454a-9b73-663281ad8df0-scripts\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.875939 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/00770641-2364-454a-9b73-663281ad8df0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.878491 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.878651 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.879396 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00770641-2364-454a-9b73-663281ad8df0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.886797 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ffmx\" (UniqueName: \"kubernetes.io/projected/00770641-2364-454a-9b73-663281ad8df0-kube-api-access-2ffmx\") pod \"ovn-northd-0\" (UID: \"00770641-2364-454a-9b73-663281ad8df0\") " pod="openstack/ovn-northd-0" Jan 21 09:17:10 crc kubenswrapper[4618]: I0121 09:17:10.974923 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.320731 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.431992 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00770641-2364-454a-9b73-663281ad8df0","Type":"ContainerStarted","Data":"70a3b7b0081602b9536e7682e5eb782a39f1f955d2293b9eeef1838aa5180348"} Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.433043 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" event={"ID":"fb73b5ff-e552-4823-bc6f-0eb876c0585e","Type":"ContainerStarted","Data":"385662482d663cd60c757b5d59546a58dc236294ecff4bc23deb4ea991c4a8fb"} Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.735803 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.735862 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.796286 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.849019 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-48a1-account-create-update-mgr8l"] Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.850121 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.852502 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.858459 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-48a1-account-create-update-mgr8l"] Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.879603 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-w5c5w"] Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.880614 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.884789 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-w5c5w"] Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.888829 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8693102-51ce-4b10-8263-8f4e32c29a42-operator-scripts\") pod \"keystone-48a1-account-create-update-mgr8l\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.888999 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hkl\" (UniqueName: \"kubernetes.io/projected/2837ccc7-f854-4658-9c44-bc288e4dad4a-kube-api-access-m9hkl\") pod \"keystone-db-create-w5c5w\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.889063 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2837ccc7-f854-4658-9c44-bc288e4dad4a-operator-scripts\") pod \"keystone-db-create-w5c5w\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.889169 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hjk\" (UniqueName: \"kubernetes.io/projected/f8693102-51ce-4b10-8263-8f4e32c29a42-kube-api-access-l8hjk\") pod \"keystone-48a1-account-create-update-mgr8l\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.990547 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hkl\" (UniqueName: \"kubernetes.io/projected/2837ccc7-f854-4658-9c44-bc288e4dad4a-kube-api-access-m9hkl\") pod \"keystone-db-create-w5c5w\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.990736 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2837ccc7-f854-4658-9c44-bc288e4dad4a-operator-scripts\") pod \"keystone-db-create-w5c5w\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.990778 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8hjk\" (UniqueName: \"kubernetes.io/projected/f8693102-51ce-4b10-8263-8f4e32c29a42-kube-api-access-l8hjk\") pod \"keystone-48a1-account-create-update-mgr8l\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.990832 4618 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8693102-51ce-4b10-8263-8f4e32c29a42-operator-scripts\") pod \"keystone-48a1-account-create-update-mgr8l\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.991430 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2837ccc7-f854-4658-9c44-bc288e4dad4a-operator-scripts\") pod \"keystone-db-create-w5c5w\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:11 crc kubenswrapper[4618]: I0121 09:17:11.991575 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8693102-51ce-4b10-8263-8f4e32c29a42-operator-scripts\") pod \"keystone-48a1-account-create-update-mgr8l\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.005634 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8hjk\" (UniqueName: \"kubernetes.io/projected/f8693102-51ce-4b10-8263-8f4e32c29a42-kube-api-access-l8hjk\") pod \"keystone-48a1-account-create-update-mgr8l\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.006104 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9hkl\" (UniqueName: \"kubernetes.io/projected/2837ccc7-f854-4658-9c44-bc288e4dad4a-kube-api-access-m9hkl\") pod \"keystone-db-create-w5c5w\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.159180 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-2nfqk"] Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.160416 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.169192 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.170008 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4385-account-create-update-4xshw"] Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.171004 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.174646 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.176588 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2nfqk"] Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.180753 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4385-account-create-update-4xshw"] Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.205877 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.310035 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55c26a4-172d-4d4b-abb9-5acde43e75df-operator-scripts\") pod \"placement-db-create-2nfqk\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.310421 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t5x2\" (UniqueName: \"kubernetes.io/projected/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-kube-api-access-5t5x2\") pod \"placement-4385-account-create-update-4xshw\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.310457 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74nsj\" (UniqueName: \"kubernetes.io/projected/b55c26a4-172d-4d4b-abb9-5acde43e75df-kube-api-access-74nsj\") pod \"placement-db-create-2nfqk\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.310484 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-operator-scripts\") pod \"placement-4385-account-create-update-4xshw\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.412423 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-operator-scripts\") pod \"placement-4385-account-create-update-4xshw\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.412492 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55c26a4-172d-4d4b-abb9-5acde43e75df-operator-scripts\") pod \"placement-db-create-2nfqk\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.412580 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t5x2\" (UniqueName: \"kubernetes.io/projected/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-kube-api-access-5t5x2\") pod \"placement-4385-account-create-update-4xshw\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.412611 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74nsj\" (UniqueName: \"kubernetes.io/projected/b55c26a4-172d-4d4b-abb9-5acde43e75df-kube-api-access-74nsj\") pod \"placement-db-create-2nfqk\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.413414 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-operator-scripts\") pod \"placement-4385-account-create-update-4xshw\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.413711 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55c26a4-172d-4d4b-abb9-5acde43e75df-operator-scripts\") pod \"placement-db-create-2nfqk\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.428935 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74nsj\" (UniqueName: \"kubernetes.io/projected/b55c26a4-172d-4d4b-abb9-5acde43e75df-kube-api-access-74nsj\") pod \"placement-db-create-2nfqk\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.428965 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t5x2\" (UniqueName: \"kubernetes.io/projected/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-kube-api-access-5t5x2\") pod \"placement-4385-account-create-update-4xshw\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.476621 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.488775 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.562640 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.574316 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-w5c5w"] Jan 21 09:17:12 crc kubenswrapper[4618]: W0121 09:17:12.578727 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2837ccc7_f854_4658_9c44_bc288e4dad4a.slice/crio-5bdd766f9e09c7c5403ca9f0ab92ea3d13192bb38b054d2ee8330fd466dff500 WatchSource:0}: Error finding container 5bdd766f9e09c7c5403ca9f0ab92ea3d13192bb38b054d2ee8330fd466dff500: Status 404 returned error can't find the container with id 5bdd766f9e09c7c5403ca9f0ab92ea3d13192bb38b054d2ee8330fd466dff500 Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.579211 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-48a1-account-create-update-mgr8l"] Jan 21 09:17:12 crc kubenswrapper[4618]: W0121 09:17:12.584669 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8693102_51ce_4b10_8263_8f4e32c29a42.slice/crio-36d8ed8f080c00eada330347236f787a5f5c803a1cc40b0a5be51e89015ba60a WatchSource:0}: Error finding container 36d8ed8f080c00eada330347236f787a5f5c803a1cc40b0a5be51e89015ba60a: Status 404 returned error can't find the container with id 36d8ed8f080c00eada330347236f787a5f5c803a1cc40b0a5be51e89015ba60a Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.847952 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2nfqk"] Jan 21 09:17:12 crc kubenswrapper[4618]: W0121 09:17:12.850023 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb55c26a4_172d_4d4b_abb9_5acde43e75df.slice/crio-287e1133ddddeb31a2639d481d6e73df579254533bf2016cf8f7c961fe0edeaa WatchSource:0}: Error 
finding container 287e1133ddddeb31a2639d481d6e73df579254533bf2016cf8f7c961fe0edeaa: Status 404 returned error can't find the container with id 287e1133ddddeb31a2639d481d6e73df579254533bf2016cf8f7c961fe0edeaa Jan 21 09:17:12 crc kubenswrapper[4618]: I0121 09:17:12.966751 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4385-account-create-update-4xshw"] Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.447906 4618 generic.go:334] "Generic (PLEG): container finished" podID="be40f1a4-5983-4d88-8cb3-7a923c1f7d45" containerID="5bcf469c17fd3ad450902872d022584f8ffdc336d641778d53260b97a1733d4e" exitCode=0 Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.447987 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4385-account-create-update-4xshw" event={"ID":"be40f1a4-5983-4d88-8cb3-7a923c1f7d45","Type":"ContainerDied","Data":"5bcf469c17fd3ad450902872d022584f8ffdc336d641778d53260b97a1733d4e"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.448458 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4385-account-create-update-4xshw" event={"ID":"be40f1a4-5983-4d88-8cb3-7a923c1f7d45","Type":"ContainerStarted","Data":"309128a25c11f8790be9125fa9cf08426e3ed8644bd9bcaae41a8f4c26c3dc04"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.461442 4618 generic.go:334] "Generic (PLEG): container finished" podID="f8693102-51ce-4b10-8263-8f4e32c29a42" containerID="9063f83d107d6f66f2d6174199ebe5f0ac4c28046a82e5d5e456a1086dbf455b" exitCode=0 Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.461590 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-48a1-account-create-update-mgr8l" event={"ID":"f8693102-51ce-4b10-8263-8f4e32c29a42","Type":"ContainerDied","Data":"9063f83d107d6f66f2d6174199ebe5f0ac4c28046a82e5d5e456a1086dbf455b"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.461684 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-48a1-account-create-update-mgr8l" event={"ID":"f8693102-51ce-4b10-8263-8f4e32c29a42","Type":"ContainerStarted","Data":"36d8ed8f080c00eada330347236f787a5f5c803a1cc40b0a5be51e89015ba60a"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.465249 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-svclb" event={"ID":"8a225614-1514-4820-8eff-8d760ef9a0b3","Type":"ContainerStarted","Data":"ee8be9fd352a99a7021366ec16b4311c595b1c55ed7bdaa9cba62f438a139838"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.467978 4618 generic.go:334] "Generic (PLEG): container finished" podID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerID="3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee" exitCode=0 Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.468096 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" event={"ID":"fb73b5ff-e552-4823-bc6f-0eb876c0585e","Type":"ContainerDied","Data":"3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.473593 4618 generic.go:334] "Generic (PLEG): container finished" podID="2837ccc7-f854-4658-9c44-bc288e4dad4a" containerID="dbdb817b025e2510ae1d191a4f58054e9e8287c91183dd592861121055eaf439" exitCode=0 Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.473679 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w5c5w" event={"ID":"2837ccc7-f854-4658-9c44-bc288e4dad4a","Type":"ContainerDied","Data":"dbdb817b025e2510ae1d191a4f58054e9e8287c91183dd592861121055eaf439"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.473745 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w5c5w" event={"ID":"2837ccc7-f854-4658-9c44-bc288e4dad4a","Type":"ContainerStarted","Data":"5bdd766f9e09c7c5403ca9f0ab92ea3d13192bb38b054d2ee8330fd466dff500"} Jan 21 09:17:13 crc 
kubenswrapper[4618]: I0121 09:17:13.474914 4618 generic.go:334] "Generic (PLEG): container finished" podID="b55c26a4-172d-4d4b-abb9-5acde43e75df" containerID="3e270b57684dde95f75300c11054eb3dbc3ada609e7b71563c8bc1ff3cbc7551" exitCode=0 Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.475024 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2nfqk" event={"ID":"b55c26a4-172d-4d4b-abb9-5acde43e75df","Type":"ContainerDied","Data":"3e270b57684dde95f75300c11054eb3dbc3ada609e7b71563c8bc1ff3cbc7551"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.475045 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2nfqk" event={"ID":"b55c26a4-172d-4d4b-abb9-5acde43e75df","Type":"ContainerStarted","Data":"287e1133ddddeb31a2639d481d6e73df579254533bf2016cf8f7c961fe0edeaa"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.477942 4618 generic.go:334] "Generic (PLEG): container finished" podID="41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" containerID="db1f436e0c5afe6e4fecb98aa7a947201fa4561be6d4b96f403b5e530d7d0381" exitCode=0 Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.477981 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" event={"ID":"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f","Type":"ContainerDied","Data":"db1f436e0c5afe6e4fecb98aa7a947201fa4561be6d4b96f403b5e530d7d0381"} Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.506908 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-svclb" podStartSLOduration=4.506805442 podStartE2EDuration="4.506805442s" podCreationTimestamp="2026-01-21 09:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:17:13.503651001 +0000 UTC m=+832.254118318" watchObservedRunningTime="2026-01-21 09:17:13.506805442 +0000 UTC m=+832.257272759" Jan 21 
09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.720872 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.841462 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-config\") pod \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.841571 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j724\" (UniqueName: \"kubernetes.io/projected/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-kube-api-access-2j724\") pod \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.841622 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-dns-svc\") pod \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.841659 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-ovsdbserver-sb\") pod \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\" (UID: \"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f\") " Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.846423 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-kube-api-access-2j724" (OuterVolumeSpecName: "kube-api-access-2j724") pod "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" (UID: "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f"). InnerVolumeSpecName "kube-api-access-2j724". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.858312 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-config" (OuterVolumeSpecName: "config") pod "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" (UID: "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.858609 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" (UID: "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.859475 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" (UID: "41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.943951 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.943983 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j724\" (UniqueName: \"kubernetes.io/projected/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-kube-api-access-2j724\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.943997 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:13 crc kubenswrapper[4618]: I0121 09:17:13.944006 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.004859 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.062689 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-gnztn"] Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.085236 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-qcpxr"] Jan 21 09:17:14 crc kubenswrapper[4618]: E0121 09:17:14.085631 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" containerName="init" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.085647 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" containerName="init" Jan 21 
09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.085822 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" containerName="init" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.086603 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.094789 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-qcpxr"] Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.249182 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.249228 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.249254 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-config\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.249759 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2cq6\" (UniqueName: 
\"kubernetes.io/projected/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-kube-api-access-h2cq6\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.250054 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.354302 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.354400 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.354431 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-config\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.354462 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2cq6\" (UniqueName: 
\"kubernetes.io/projected/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-kube-api-access-h2cq6\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.354515 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.355495 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.355541 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.356059 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.356128 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-config\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: 
\"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.377015 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2cq6\" (UniqueName: \"kubernetes.io/projected/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-kube-api-access-h2cq6\") pod \"dnsmasq-dns-67fdf7998c-qcpxr\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.401386 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.491661 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" event={"ID":"41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f","Type":"ContainerDied","Data":"aae37b23d8de5debe98ccc0142c9553c1855d900e2a93dfc73782defb423ca1c"} Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.491846 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-5dp7k" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.491881 4618 scope.go:117] "RemoveContainer" containerID="db1f436e0c5afe6e4fecb98aa7a947201fa4561be6d4b96f403b5e530d7d0381" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.495982 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00770641-2364-454a-9b73-663281ad8df0","Type":"ContainerStarted","Data":"75eb64bd07a41b73142ffd33bdb57dc7a642ba7044f4236cc6505b31a56cd82c"} Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.496010 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00770641-2364-454a-9b73-663281ad8df0","Type":"ContainerStarted","Data":"62034e4f803576c6d31ed26da702cd55556352448f012e4e3f439b652cb01e61"} Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.496821 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.500091 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" event={"ID":"fb73b5ff-e552-4823-bc6f-0eb876c0585e","Type":"ContainerStarted","Data":"33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca"} Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.525901 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.583601487 podStartE2EDuration="4.5258847s" podCreationTimestamp="2026-01-21 09:17:10 +0000 UTC" firstStartedPulling="2026-01-21 09:17:11.32626686 +0000 UTC m=+830.076734177" lastFinishedPulling="2026-01-21 09:17:13.268550074 +0000 UTC m=+832.019017390" observedRunningTime="2026-01-21 09:17:14.523717288 +0000 UTC m=+833.274184595" watchObservedRunningTime="2026-01-21 09:17:14.5258847 +0000 UTC m=+833.276352017" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.546863 4618 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" podStartSLOduration=5.546840611 podStartE2EDuration="5.546840611s" podCreationTimestamp="2026-01-21 09:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:17:14.544403362 +0000 UTC m=+833.294870679" watchObservedRunningTime="2026-01-21 09:17:14.546840611 +0000 UTC m=+833.297307928" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.579451 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-5dp7k"] Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.581823 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-5dp7k"] Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.787585 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-qcpxr"] Jan 21 09:17:14 crc kubenswrapper[4618]: W0121 09:17:14.792154 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1bfd2f4_8562_470f_b4ec_ac051f8565d1.slice/crio-c2797755ab18079c42f6fcb2b20975c2832985f25c256c49d85d24b28d45ed0f WatchSource:0}: Error finding container c2797755ab18079c42f6fcb2b20975c2832985f25c256c49d85d24b28d45ed0f: Status 404 returned error can't find the container with id c2797755ab18079c42f6fcb2b20975c2832985f25c256c49d85d24b28d45ed0f Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.825360 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.845076 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.854394 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.907850 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.962974 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74nsj\" (UniqueName: \"kubernetes.io/projected/b55c26a4-172d-4d4b-abb9-5acde43e75df-kube-api-access-74nsj\") pod \"b55c26a4-172d-4d4b-abb9-5acde43e75df\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.963015 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9hkl\" (UniqueName: \"kubernetes.io/projected/2837ccc7-f854-4658-9c44-bc288e4dad4a-kube-api-access-m9hkl\") pod \"2837ccc7-f854-4658-9c44-bc288e4dad4a\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.963061 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55c26a4-172d-4d4b-abb9-5acde43e75df-operator-scripts\") pod \"b55c26a4-172d-4d4b-abb9-5acde43e75df\" (UID: \"b55c26a4-172d-4d4b-abb9-5acde43e75df\") " Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.963092 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2837ccc7-f854-4658-9c44-bc288e4dad4a-operator-scripts\") pod \"2837ccc7-f854-4658-9c44-bc288e4dad4a\" (UID: \"2837ccc7-f854-4658-9c44-bc288e4dad4a\") " Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.963124 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8693102-51ce-4b10-8263-8f4e32c29a42-operator-scripts\") pod \"f8693102-51ce-4b10-8263-8f4e32c29a42\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.963300 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8hjk\" (UniqueName: \"kubernetes.io/projected/f8693102-51ce-4b10-8263-8f4e32c29a42-kube-api-access-l8hjk\") pod \"f8693102-51ce-4b10-8263-8f4e32c29a42\" (UID: \"f8693102-51ce-4b10-8263-8f4e32c29a42\") " Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.964021 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2837ccc7-f854-4658-9c44-bc288e4dad4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2837ccc7-f854-4658-9c44-bc288e4dad4a" (UID: "2837ccc7-f854-4658-9c44-bc288e4dad4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.964028 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55c26a4-172d-4d4b-abb9-5acde43e75df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b55c26a4-172d-4d4b-abb9-5acde43e75df" (UID: "b55c26a4-172d-4d4b-abb9-5acde43e75df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.964070 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8693102-51ce-4b10-8263-8f4e32c29a42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8693102-51ce-4b10-8263-8f4e32c29a42" (UID: "f8693102-51ce-4b10-8263-8f4e32c29a42"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.967374 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55c26a4-172d-4d4b-abb9-5acde43e75df-kube-api-access-74nsj" (OuterVolumeSpecName: "kube-api-access-74nsj") pod "b55c26a4-172d-4d4b-abb9-5acde43e75df" (UID: "b55c26a4-172d-4d4b-abb9-5acde43e75df"). InnerVolumeSpecName "kube-api-access-74nsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.967584 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2837ccc7-f854-4658-9c44-bc288e4dad4a-kube-api-access-m9hkl" (OuterVolumeSpecName: "kube-api-access-m9hkl") pod "2837ccc7-f854-4658-9c44-bc288e4dad4a" (UID: "2837ccc7-f854-4658-9c44-bc288e4dad4a"). InnerVolumeSpecName "kube-api-access-m9hkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:14 crc kubenswrapper[4618]: I0121 09:17:14.967675 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8693102-51ce-4b10-8263-8f4e32c29a42-kube-api-access-l8hjk" (OuterVolumeSpecName: "kube-api-access-l8hjk") pod "f8693102-51ce-4b10-8263-8f4e32c29a42" (UID: "f8693102-51ce-4b10-8263-8f4e32c29a42"). InnerVolumeSpecName "kube-api-access-l8hjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.064644 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t5x2\" (UniqueName: \"kubernetes.io/projected/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-kube-api-access-5t5x2\") pod \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.064742 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-operator-scripts\") pod \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\" (UID: \"be40f1a4-5983-4d88-8cb3-7a923c1f7d45\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065501 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be40f1a4-5983-4d88-8cb3-7a923c1f7d45" (UID: "be40f1a4-5983-4d88-8cb3-7a923c1f7d45"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065624 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8hjk\" (UniqueName: \"kubernetes.io/projected/f8693102-51ce-4b10-8263-8f4e32c29a42-kube-api-access-l8hjk\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065652 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74nsj\" (UniqueName: \"kubernetes.io/projected/b55c26a4-172d-4d4b-abb9-5acde43e75df-kube-api-access-74nsj\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065663 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9hkl\" (UniqueName: \"kubernetes.io/projected/2837ccc7-f854-4658-9c44-bc288e4dad4a-kube-api-access-m9hkl\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065673 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55c26a4-172d-4d4b-abb9-5acde43e75df-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065687 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2837ccc7-f854-4658-9c44-bc288e4dad4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.065697 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8693102-51ce-4b10-8263-8f4e32c29a42-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.068513 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-kube-api-access-5t5x2" (OuterVolumeSpecName: "kube-api-access-5t5x2") pod 
"be40f1a4-5983-4d88-8cb3-7a923c1f7d45" (UID: "be40f1a4-5983-4d88-8cb3-7a923c1f7d45"). InnerVolumeSpecName "kube-api-access-5t5x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.161808 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.162241 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55c26a4-172d-4d4b-abb9-5acde43e75df" containerName="mariadb-database-create" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162256 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55c26a4-172d-4d4b-abb9-5acde43e75df" containerName="mariadb-database-create" Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.162280 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2837ccc7-f854-4658-9c44-bc288e4dad4a" containerName="mariadb-database-create" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162286 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2837ccc7-f854-4658-9c44-bc288e4dad4a" containerName="mariadb-database-create" Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.162293 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be40f1a4-5983-4d88-8cb3-7a923c1f7d45" containerName="mariadb-account-create-update" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162299 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="be40f1a4-5983-4d88-8cb3-7a923c1f7d45" containerName="mariadb-account-create-update" Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.162315 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8693102-51ce-4b10-8263-8f4e32c29a42" containerName="mariadb-account-create-update" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162330 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8693102-51ce-4b10-8263-8f4e32c29a42" 
containerName="mariadb-account-create-update" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162513 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55c26a4-172d-4d4b-abb9-5acde43e75df" containerName="mariadb-database-create" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162524 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2837ccc7-f854-4658-9c44-bc288e4dad4a" containerName="mariadb-database-create" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162532 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8693102-51ce-4b10-8263-8f4e32c29a42" containerName="mariadb-account-create-update" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.162541 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="be40f1a4-5983-4d88-8cb3-7a923c1f7d45" containerName="mariadb-account-create-update" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.166721 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.168160 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.170484 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.170519 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.170878 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-l96ww" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.173181 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t5x2\" (UniqueName: \"kubernetes.io/projected/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-kube-api-access-5t5x2\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.173200 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be40f1a4-5983-4d88-8cb3-7a923c1f7d45-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.180500 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.274741 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.274866 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.274899 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tlk9\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-kube-api-access-6tlk9\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.274949 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-cache\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.275068 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-lock\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.342662 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376416 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-lock\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376460 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376536 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376560 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tlk9\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-kube-api-access-6tlk9\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376589 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-cache\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.376706 4618 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.376738 4618 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.376789 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift podName:67864b8b-0c06-4f06-8b43-87fcdd8a3d42 nodeName:}" failed. 
No retries permitted until 2026-01-21 09:17:15.876773086 +0000 UTC m=+834.627240404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift") pod "swift-storage-0" (UID: "67864b8b-0c06-4f06-8b43-87fcdd8a3d42") : configmap "swift-ring-files" not found Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376815 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376925 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-cache\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.376819 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-lock\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.391638 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.392417 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tlk9\" (UniqueName: 
\"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-kube-api-access-6tlk9\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.508341 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2nfqk" event={"ID":"b55c26a4-172d-4d4b-abb9-5acde43e75df","Type":"ContainerDied","Data":"287e1133ddddeb31a2639d481d6e73df579254533bf2016cf8f7c961fe0edeaa"} Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.508392 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287e1133ddddeb31a2639d481d6e73df579254533bf2016cf8f7c961fe0edeaa" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.508578 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2nfqk" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.510557 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4385-account-create-update-4xshw" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.510550 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4385-account-create-update-4xshw" event={"ID":"be40f1a4-5983-4d88-8cb3-7a923c1f7d45","Type":"ContainerDied","Data":"309128a25c11f8790be9125fa9cf08426e3ed8644bd9bcaae41a8f4c26c3dc04"} Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.510660 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="309128a25c11f8790be9125fa9cf08426e3ed8644bd9bcaae41a8f4c26c3dc04" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.511936 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-48a1-account-create-update-mgr8l" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.511930 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-48a1-account-create-update-mgr8l" event={"ID":"f8693102-51ce-4b10-8263-8f4e32c29a42","Type":"ContainerDied","Data":"36d8ed8f080c00eada330347236f787a5f5c803a1cc40b0a5be51e89015ba60a"} Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.512206 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36d8ed8f080c00eada330347236f787a5f5c803a1cc40b0a5be51e89015ba60a" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.513298 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w5c5w" event={"ID":"2837ccc7-f854-4658-9c44-bc288e4dad4a","Type":"ContainerDied","Data":"5bdd766f9e09c7c5403ca9f0ab92ea3d13192bb38b054d2ee8330fd466dff500"} Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.513357 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bdd766f9e09c7c5403ca9f0ab92ea3d13192bb38b054d2ee8330fd466dff500" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.513311 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-w5c5w" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.514529 4618 generic.go:334] "Generic (PLEG): container finished" podID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerID="459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd" exitCode=0 Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.514623 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" event={"ID":"b1bfd2f4-8562-470f-b4ec-ac051f8565d1","Type":"ContainerDied","Data":"459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd"} Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.514653 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" event={"ID":"b1bfd2f4-8562-470f-b4ec-ac051f8565d1","Type":"ContainerStarted","Data":"c2797755ab18079c42f6fcb2b20975c2832985f25c256c49d85d24b28d45ed0f"} Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.514843 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerName="dnsmasq-dns" containerID="cri-o://33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca" gracePeriod=10 Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.554459 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f" path="/var/lib/kubelet/pods/41a2fb5c-d945-4cdf-90e8-d1a7aa6de59f/volumes" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.653982 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-l5bzv"] Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.655013 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.656452 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.656626 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.656759 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.663265 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-l5bzv"] Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.687333 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-swiftconf\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.687566 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-dispersionconf\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.687703 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgh7\" (UniqueName: \"kubernetes.io/projected/a34a0fe1-3391-4b76-8274-d817bcca6d03-kube-api-access-9zgh7\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 
09:17:15.687869 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-ring-data-devices\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.687955 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-combined-ca-bundle\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.688081 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-scripts\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.688184 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a34a0fe1-3391-4b76-8274-d817bcca6d03-etc-swift\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788741 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-swiftconf\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788813 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-dispersionconf\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788838 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgh7\" (UniqueName: \"kubernetes.io/projected/a34a0fe1-3391-4b76-8274-d817bcca6d03-kube-api-access-9zgh7\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788884 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-ring-data-devices\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788900 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-combined-ca-bundle\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788925 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-scripts\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.788969 4618 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a34a0fe1-3391-4b76-8274-d817bcca6d03-etc-swift\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.789445 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a34a0fe1-3391-4b76-8274-d817bcca6d03-etc-swift\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.789961 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-scripts\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.790419 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-ring-data-devices\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.792715 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-swiftconf\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.792717 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-combined-ca-bundle\") pod 
\"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.793039 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-dispersionconf\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.802341 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgh7\" (UniqueName: \"kubernetes.io/projected/a34a0fe1-3391-4b76-8274-d817bcca6d03-kube-api-access-9zgh7\") pod \"swift-ring-rebalance-l5bzv\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.858202 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.890273 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-sb\") pod \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.890319 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-config\") pod \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.890371 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-nb\") pod \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.890432 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-dns-svc\") pod \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.890757 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kld6q\" (UniqueName: \"kubernetes.io/projected/fb73b5ff-e552-4823-bc6f-0eb876c0585e-kube-api-access-kld6q\") pod \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\" (UID: \"fb73b5ff-e552-4823-bc6f-0eb876c0585e\") " Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.891044 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.891238 4618 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.891257 4618 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 09:17:15 crc kubenswrapper[4618]: E0121 09:17:15.891302 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift podName:67864b8b-0c06-4f06-8b43-87fcdd8a3d42 nodeName:}" failed. No retries permitted until 2026-01-21 09:17:16.891286847 +0000 UTC m=+835.641754165 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift") pod "swift-storage-0" (UID: "67864b8b-0c06-4f06-8b43-87fcdd8a3d42") : configmap "swift-ring-files" not found Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.899555 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb73b5ff-e552-4823-bc6f-0eb876c0585e-kube-api-access-kld6q" (OuterVolumeSpecName: "kube-api-access-kld6q") pod "fb73b5ff-e552-4823-bc6f-0eb876c0585e" (UID: "fb73b5ff-e552-4823-bc6f-0eb876c0585e"). InnerVolumeSpecName "kube-api-access-kld6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.920049 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-config" (OuterVolumeSpecName: "config") pod "fb73b5ff-e552-4823-bc6f-0eb876c0585e" (UID: "fb73b5ff-e552-4823-bc6f-0eb876c0585e"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.920181 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb73b5ff-e552-4823-bc6f-0eb876c0585e" (UID: "fb73b5ff-e552-4823-bc6f-0eb876c0585e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.921378 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb73b5ff-e552-4823-bc6f-0eb876c0585e" (UID: "fb73b5ff-e552-4823-bc6f-0eb876c0585e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.924053 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb73b5ff-e552-4823-bc6f-0eb876c0585e" (UID: "fb73b5ff-e552-4823-bc6f-0eb876c0585e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.980899 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.993675 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.993711 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.993722 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.993731 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb73b5ff-e552-4823-bc6f-0eb876c0585e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:15 crc kubenswrapper[4618]: I0121 09:17:15.993741 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kld6q\" (UniqueName: \"kubernetes.io/projected/fb73b5ff-e552-4823-bc6f-0eb876c0585e-kube-api-access-kld6q\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.188281 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-l5bzv"] Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.524352 4618 generic.go:334] "Generic (PLEG): container finished" podID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerID="33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca" exitCode=0 Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.524415 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.524416 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" event={"ID":"fb73b5ff-e552-4823-bc6f-0eb876c0585e","Type":"ContainerDied","Data":"33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca"} Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.524879 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-gnztn" event={"ID":"fb73b5ff-e552-4823-bc6f-0eb876c0585e","Type":"ContainerDied","Data":"385662482d663cd60c757b5d59546a58dc236294ecff4bc23deb4ea991c4a8fb"} Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.524895 4618 scope.go:117] "RemoveContainer" containerID="33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.526899 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" event={"ID":"b1bfd2f4-8562-470f-b4ec-ac051f8565d1","Type":"ContainerStarted","Data":"c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204"} Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.527016 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.528563 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l5bzv" event={"ID":"a34a0fe1-3391-4b76-8274-d817bcca6d03","Type":"ContainerStarted","Data":"b115c92ed2df8f530f294bef352f3b6bc510e264468f717e060dab0234cb9b7c"} Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.542799 4618 scope.go:117] "RemoveContainer" containerID="3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.544043 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" podStartSLOduration=2.544030459 podStartE2EDuration="2.544030459s" podCreationTimestamp="2026-01-21 09:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:17:16.540740334 +0000 UTC m=+835.291207650" watchObservedRunningTime="2026-01-21 09:17:16.544030459 +0000 UTC m=+835.294497776" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.556656 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-gnztn"] Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.562044 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-gnztn"] Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.562102 4618 scope.go:117] "RemoveContainer" containerID="33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca" Jan 21 09:17:16 crc kubenswrapper[4618]: E0121 09:17:16.562611 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca\": container with ID starting with 33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca not found: ID does not exist" containerID="33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.562647 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca"} err="failed to get container status \"33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca\": rpc error: code = NotFound desc = could not find container \"33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca\": container with ID starting with 33311843f7eab8165afd8057029081b90a67075ff7aae262bd27038b93e583ca not found: ID does not exist" 
Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.562665 4618 scope.go:117] "RemoveContainer" containerID="3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee" Jan 21 09:17:16 crc kubenswrapper[4618]: E0121 09:17:16.562980 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee\": container with ID starting with 3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee not found: ID does not exist" containerID="3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.563000 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee"} err="failed to get container status \"3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee\": rpc error: code = NotFound desc = could not find container \"3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee\": container with ID starting with 3ccafc08659b9da6c08f3e42a08d2ac46be12a844195ed4e647318e946ce37ee not found: ID does not exist" Jan 21 09:17:16 crc kubenswrapper[4618]: I0121 09:17:16.909212 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:16 crc kubenswrapper[4618]: E0121 09:17:16.909436 4618 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 09:17:16 crc kubenswrapper[4618]: E0121 09:17:16.909464 4618 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 09:17:16 crc 
kubenswrapper[4618]: E0121 09:17:16.909517 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift podName:67864b8b-0c06-4f06-8b43-87fcdd8a3d42 nodeName:}" failed. No retries permitted until 2026-01-21 09:17:18.909500355 +0000 UTC m=+837.659967673 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift") pod "swift-storage-0" (UID: "67864b8b-0c06-4f06-8b43-87fcdd8a3d42") : configmap "swift-ring-files" not found Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.260834 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fbbjn"] Jan 21 09:17:17 crc kubenswrapper[4618]: E0121 09:17:17.261192 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerName="dnsmasq-dns" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.261205 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerName="dnsmasq-dns" Jan 21 09:17:17 crc kubenswrapper[4618]: E0121 09:17:17.261222 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerName="init" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.261229 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerName="init" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.261405 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" containerName="dnsmasq-dns" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.261874 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.271043 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fbbjn"] Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.319093 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2858edc9-0823-41b9-9c3a-ca8eecb450fa-operator-scripts\") pod \"glance-db-create-fbbjn\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.319224 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sftlh\" (UniqueName: \"kubernetes.io/projected/2858edc9-0823-41b9-9c3a-ca8eecb450fa-kube-api-access-sftlh\") pod \"glance-db-create-fbbjn\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.365847 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6384-account-create-update-8kht2"] Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.367619 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.369246 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.372060 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6384-account-create-update-8kht2"] Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.421434 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2858edc9-0823-41b9-9c3a-ca8eecb450fa-operator-scripts\") pod \"glance-db-create-fbbjn\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.421531 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656hj\" (UniqueName: \"kubernetes.io/projected/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-kube-api-access-656hj\") pod \"glance-6384-account-create-update-8kht2\" (UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.421562 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sftlh\" (UniqueName: \"kubernetes.io/projected/2858edc9-0823-41b9-9c3a-ca8eecb450fa-kube-api-access-sftlh\") pod \"glance-db-create-fbbjn\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.421661 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-operator-scripts\") pod \"glance-6384-account-create-update-8kht2\" (UID: 
\"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.422073 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2858edc9-0823-41b9-9c3a-ca8eecb450fa-operator-scripts\") pod \"glance-db-create-fbbjn\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.445734 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sftlh\" (UniqueName: \"kubernetes.io/projected/2858edc9-0823-41b9-9c3a-ca8eecb450fa-kube-api-access-sftlh\") pod \"glance-db-create-fbbjn\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.523660 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-656hj\" (UniqueName: \"kubernetes.io/projected/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-kube-api-access-656hj\") pod \"glance-6384-account-create-update-8kht2\" (UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.524043 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-operator-scripts\") pod \"glance-6384-account-create-update-8kht2\" (UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.526312 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-operator-scripts\") pod \"glance-6384-account-create-update-8kht2\" 
(UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.541136 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-656hj\" (UniqueName: \"kubernetes.io/projected/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-kube-api-access-656hj\") pod \"glance-6384-account-create-update-8kht2\" (UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.546043 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb73b5ff-e552-4823-bc6f-0eb876c0585e" path="/var/lib/kubelet/pods/fb73b5ff-e552-4823-bc6f-0eb876c0585e/volumes" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.579376 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:17 crc kubenswrapper[4618]: I0121 09:17:17.681116 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:18 crc kubenswrapper[4618]: I0121 09:17:18.946239 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:18 crc kubenswrapper[4618]: E0121 09:17:18.946650 4618 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 09:17:18 crc kubenswrapper[4618]: E0121 09:17:18.946663 4618 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 09:17:18 crc kubenswrapper[4618]: E0121 09:17:18.946703 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift podName:67864b8b-0c06-4f06-8b43-87fcdd8a3d42 nodeName:}" failed. No retries permitted until 2026-01-21 09:17:22.946690637 +0000 UTC m=+841.697157954 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift") pod "swift-storage-0" (UID: "67864b8b-0c06-4f06-8b43-87fcdd8a3d42") : configmap "swift-ring-files" not found Jan 21 09:17:18 crc kubenswrapper[4618]: I0121 09:17:18.964624 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-g8jg6"] Jan 21 09:17:18 crc kubenswrapper[4618]: I0121 09:17:18.965589 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:18 crc kubenswrapper[4618]: I0121 09:17:18.967685 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 21 09:17:18 crc kubenswrapper[4618]: I0121 09:17:18.970965 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g8jg6"] Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.148723 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v276f\" (UniqueName: \"kubernetes.io/projected/a270ff81-6a20-462e-aa6a-5962f9109650-kube-api-access-v276f\") pod \"root-account-create-update-g8jg6\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.148819 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a270ff81-6a20-462e-aa6a-5962f9109650-operator-scripts\") pod \"root-account-create-update-g8jg6\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: W0121 09:17:19.205564 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice/crio-6d75f1a6c15561681d22e7b152ab06c66dcf7249a93e45bc99b041fbdc85b570 WatchSource:0}: Error finding container 6d75f1a6c15561681d22e7b152ab06c66dcf7249a93e45bc99b041fbdc85b570: Status 404 returned error can't find the container with id 6d75f1a6c15561681d22e7b152ab06c66dcf7249a93e45bc99b041fbdc85b570 Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.213822 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fbbjn"] Jan 21 09:17:19 crc 
kubenswrapper[4618]: I0121 09:17:19.243705 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6384-account-create-update-8kht2"] Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.250504 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v276f\" (UniqueName: \"kubernetes.io/projected/a270ff81-6a20-462e-aa6a-5962f9109650-kube-api-access-v276f\") pod \"root-account-create-update-g8jg6\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.250680 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a270ff81-6a20-462e-aa6a-5962f9109650-operator-scripts\") pod \"root-account-create-update-g8jg6\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.251542 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a270ff81-6a20-462e-aa6a-5962f9109650-operator-scripts\") pod \"root-account-create-update-g8jg6\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.266905 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v276f\" (UniqueName: \"kubernetes.io/projected/a270ff81-6a20-462e-aa6a-5962f9109650-kube-api-access-v276f\") pod \"root-account-create-update-g8jg6\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.278864 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.545424 4618 generic.go:334] "Generic (PLEG): container finished" podID="76cf94b1-8904-4389-8ef3-8dd36ea02ecf" containerID="447b5e82b2477dfc53d662a416596a09b3a32b25f5eef55f5f764aeb64a9b5ca" exitCode=0 Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.545680 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6384-account-create-update-8kht2" event={"ID":"76cf94b1-8904-4389-8ef3-8dd36ea02ecf","Type":"ContainerDied","Data":"447b5e82b2477dfc53d662a416596a09b3a32b25f5eef55f5f764aeb64a9b5ca"} Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.545701 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6384-account-create-update-8kht2" event={"ID":"76cf94b1-8904-4389-8ef3-8dd36ea02ecf","Type":"ContainerStarted","Data":"49f56348f5007fdd697a5b028cbc9dcdf9a987a9e1084485c5e88d97ff4178c9"} Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.547810 4618 generic.go:334] "Generic (PLEG): container finished" podID="2858edc9-0823-41b9-9c3a-ca8eecb450fa" containerID="af9e0729408e82eec4aca27f3402390aaed3ef37c6d7a2df988bc8348db2c012" exitCode=0 Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.547850 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fbbjn" event={"ID":"2858edc9-0823-41b9-9c3a-ca8eecb450fa","Type":"ContainerDied","Data":"af9e0729408e82eec4aca27f3402390aaed3ef37c6d7a2df988bc8348db2c012"} Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.547865 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fbbjn" event={"ID":"2858edc9-0823-41b9-9c3a-ca8eecb450fa","Type":"ContainerStarted","Data":"6d75f1a6c15561681d22e7b152ab06c66dcf7249a93e45bc99b041fbdc85b570"} Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.548954 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-l5bzv" event={"ID":"a34a0fe1-3391-4b76-8274-d817bcca6d03","Type":"ContainerStarted","Data":"56833f08d16e221c9db181c165afd00f5fbcf02f8c4e2b4d3fbcde896a922ed9"} Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.581529 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-l5bzv" podStartSLOduration=1.969430309 podStartE2EDuration="4.581514694s" podCreationTimestamp="2026-01-21 09:17:15 +0000 UTC" firstStartedPulling="2026-01-21 09:17:16.207268733 +0000 UTC m=+834.957736050" lastFinishedPulling="2026-01-21 09:17:18.819353108 +0000 UTC m=+837.569820435" observedRunningTime="2026-01-21 09:17:19.580693548 +0000 UTC m=+838.331160864" watchObservedRunningTime="2026-01-21 09:17:19.581514694 +0000 UTC m=+838.331982011" Jan 21 09:17:19 crc kubenswrapper[4618]: I0121 09:17:19.639322 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g8jg6"] Jan 21 09:17:19 crc kubenswrapper[4618]: W0121 09:17:19.648105 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda270ff81_6a20_462e_aa6a_5962f9109650.slice/crio-b1b73d7ad45386d65a0189918f4f6dc7ca6505f3448d5d89c574888fff8a7887 WatchSource:0}: Error finding container b1b73d7ad45386d65a0189918f4f6dc7ca6505f3448d5d89c574888fff8a7887: Status 404 returned error can't find the container with id b1b73d7ad45386d65a0189918f4f6dc7ca6505f3448d5d89c574888fff8a7887 Jan 21 09:17:20 crc kubenswrapper[4618]: I0121 09:17:20.556382 4618 generic.go:334] "Generic (PLEG): container finished" podID="a270ff81-6a20-462e-aa6a-5962f9109650" containerID="eb8a1155b53479e61967646e83499e2dd931bdd152bd2de1af221c9cafafdb08" exitCode=0 Jan 21 09:17:20 crc kubenswrapper[4618]: I0121 09:17:20.557043 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g8jg6" 
event={"ID":"a270ff81-6a20-462e-aa6a-5962f9109650","Type":"ContainerDied","Data":"eb8a1155b53479e61967646e83499e2dd931bdd152bd2de1af221c9cafafdb08"} Jan 21 09:17:20 crc kubenswrapper[4618]: I0121 09:17:20.557065 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g8jg6" event={"ID":"a270ff81-6a20-462e-aa6a-5962f9109650","Type":"ContainerStarted","Data":"b1b73d7ad45386d65a0189918f4f6dc7ca6505f3448d5d89c574888fff8a7887"} Jan 21 09:17:20 crc kubenswrapper[4618]: I0121 09:17:20.900552 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:20 crc kubenswrapper[4618]: I0121 09:17:20.905301 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.072541 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-operator-scripts\") pod \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\" (UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.072658 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sftlh\" (UniqueName: \"kubernetes.io/projected/2858edc9-0823-41b9-9c3a-ca8eecb450fa-kube-api-access-sftlh\") pod \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.072756 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2858edc9-0823-41b9-9c3a-ca8eecb450fa-operator-scripts\") pod \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\" (UID: \"2858edc9-0823-41b9-9c3a-ca8eecb450fa\") " Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.072776 
4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-656hj\" (UniqueName: \"kubernetes.io/projected/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-kube-api-access-656hj\") pod \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\" (UID: \"76cf94b1-8904-4389-8ef3-8dd36ea02ecf\") " Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.073303 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76cf94b1-8904-4389-8ef3-8dd36ea02ecf" (UID: "76cf94b1-8904-4389-8ef3-8dd36ea02ecf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.073803 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2858edc9-0823-41b9-9c3a-ca8eecb450fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2858edc9-0823-41b9-9c3a-ca8eecb450fa" (UID: "2858edc9-0823-41b9-9c3a-ca8eecb450fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.078367 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-kube-api-access-656hj" (OuterVolumeSpecName: "kube-api-access-656hj") pod "76cf94b1-8904-4389-8ef3-8dd36ea02ecf" (UID: "76cf94b1-8904-4389-8ef3-8dd36ea02ecf"). InnerVolumeSpecName "kube-api-access-656hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.078405 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2858edc9-0823-41b9-9c3a-ca8eecb450fa-kube-api-access-sftlh" (OuterVolumeSpecName: "kube-api-access-sftlh") pod "2858edc9-0823-41b9-9c3a-ca8eecb450fa" (UID: "2858edc9-0823-41b9-9c3a-ca8eecb450fa"). InnerVolumeSpecName "kube-api-access-sftlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.174039 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2858edc9-0823-41b9-9c3a-ca8eecb450fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.174081 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-656hj\" (UniqueName: \"kubernetes.io/projected/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-kube-api-access-656hj\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.174093 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76cf94b1-8904-4389-8ef3-8dd36ea02ecf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.174101 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sftlh\" (UniqueName: \"kubernetes.io/projected/2858edc9-0823-41b9-9c3a-ca8eecb450fa-kube-api-access-sftlh\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.568544 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6384-account-create-update-8kht2" event={"ID":"76cf94b1-8904-4389-8ef3-8dd36ea02ecf","Type":"ContainerDied","Data":"49f56348f5007fdd697a5b028cbc9dcdf9a987a9e1084485c5e88d97ff4178c9"} Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 
09:17:21.569260 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49f56348f5007fdd697a5b028cbc9dcdf9a987a9e1084485c5e88d97ff4178c9" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.568554 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6384-account-create-update-8kht2" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.569972 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fbbjn" event={"ID":"2858edc9-0823-41b9-9c3a-ca8eecb450fa","Type":"ContainerDied","Data":"6d75f1a6c15561681d22e7b152ab06c66dcf7249a93e45bc99b041fbdc85b570"} Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.569993 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fbbjn" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.570005 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d75f1a6c15561681d22e7b152ab06c66dcf7249a93e45bc99b041fbdc85b570" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.812883 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.985953 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v276f\" (UniqueName: \"kubernetes.io/projected/a270ff81-6a20-462e-aa6a-5962f9109650-kube-api-access-v276f\") pod \"a270ff81-6a20-462e-aa6a-5962f9109650\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.986518 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a270ff81-6a20-462e-aa6a-5962f9109650-operator-scripts\") pod \"a270ff81-6a20-462e-aa6a-5962f9109650\" (UID: \"a270ff81-6a20-462e-aa6a-5962f9109650\") " Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.987038 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a270ff81-6a20-462e-aa6a-5962f9109650-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a270ff81-6a20-462e-aa6a-5962f9109650" (UID: "a270ff81-6a20-462e-aa6a-5962f9109650"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.987621 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a270ff81-6a20-462e-aa6a-5962f9109650-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:21 crc kubenswrapper[4618]: I0121 09:17:21.990945 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a270ff81-6a20-462e-aa6a-5962f9109650-kube-api-access-v276f" (OuterVolumeSpecName: "kube-api-access-v276f") pod "a270ff81-6a20-462e-aa6a-5962f9109650" (UID: "a270ff81-6a20-462e-aa6a-5962f9109650"). InnerVolumeSpecName "kube-api-access-v276f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.088790 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v276f\" (UniqueName: \"kubernetes.io/projected/a270ff81-6a20-462e-aa6a-5962f9109650-kube-api-access-v276f\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.494530 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wmsxn"] Jan 21 09:17:22 crc kubenswrapper[4618]: E0121 09:17:22.494841 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2858edc9-0823-41b9-9c3a-ca8eecb450fa" containerName="mariadb-database-create" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.494861 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2858edc9-0823-41b9-9c3a-ca8eecb450fa" containerName="mariadb-database-create" Jan 21 09:17:22 crc kubenswrapper[4618]: E0121 09:17:22.494870 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a270ff81-6a20-462e-aa6a-5962f9109650" containerName="mariadb-account-create-update" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.494877 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="a270ff81-6a20-462e-aa6a-5962f9109650" containerName="mariadb-account-create-update" Jan 21 09:17:22 crc kubenswrapper[4618]: E0121 09:17:22.494911 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76cf94b1-8904-4389-8ef3-8dd36ea02ecf" containerName="mariadb-account-create-update" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.494917 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="76cf94b1-8904-4389-8ef3-8dd36ea02ecf" containerName="mariadb-account-create-update" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.495047 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="a270ff81-6a20-462e-aa6a-5962f9109650" containerName="mariadb-account-create-update" Jan 21 09:17:22 crc 
kubenswrapper[4618]: I0121 09:17:22.495062 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2858edc9-0823-41b9-9c3a-ca8eecb450fa" containerName="mariadb-database-create" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.495076 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="76cf94b1-8904-4389-8ef3-8dd36ea02ecf" containerName="mariadb-account-create-update" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.495574 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.496954 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.497197 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vstgl" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.503960 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wmsxn"] Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.584158 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g8jg6" event={"ID":"a270ff81-6a20-462e-aa6a-5962f9109650","Type":"ContainerDied","Data":"b1b73d7ad45386d65a0189918f4f6dc7ca6505f3448d5d89c574888fff8a7887"} Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.584277 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b73d7ad45386d65a0189918f4f6dc7ca6505f3448d5d89c574888fff8a7887" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.584366 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g8jg6" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.598482 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-config-data\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.598569 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-db-sync-config-data\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.598660 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-combined-ca-bundle\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.598769 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vpst\" (UniqueName: \"kubernetes.io/projected/0fb18034-16fd-4cfc-8748-2c58b8584346-kube-api-access-2vpst\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.699871 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-combined-ca-bundle\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " 
pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.700027 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vpst\" (UniqueName: \"kubernetes.io/projected/0fb18034-16fd-4cfc-8748-2c58b8584346-kube-api-access-2vpst\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.700078 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-config-data\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.700248 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-db-sync-config-data\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.704273 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-combined-ca-bundle\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.710486 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-db-sync-config-data\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.712314 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-config-data\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.720924 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vpst\" (UniqueName: \"kubernetes.io/projected/0fb18034-16fd-4cfc-8748-2c58b8584346-kube-api-access-2vpst\") pod \"glance-db-sync-wmsxn\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:22 crc kubenswrapper[4618]: I0121 09:17:22.815698 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:23 crc kubenswrapper[4618]: I0121 09:17:23.004825 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:23 crc kubenswrapper[4618]: E0121 09:17:23.005021 4618 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 21 09:17:23 crc kubenswrapper[4618]: E0121 09:17:23.005278 4618 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 21 09:17:23 crc kubenswrapper[4618]: E0121 09:17:23.005320 4618 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift podName:67864b8b-0c06-4f06-8b43-87fcdd8a3d42 nodeName:}" failed. No retries permitted until 2026-01-21 09:17:31.005308118 +0000 UTC m=+849.755775435 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift") pod "swift-storage-0" (UID: "67864b8b-0c06-4f06-8b43-87fcdd8a3d42") : configmap "swift-ring-files" not found Jan 21 09:17:23 crc kubenswrapper[4618]: W0121 09:17:23.249655 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fb18034_16fd_4cfc_8748_2c58b8584346.slice/crio-00c646d72793b79c17bc1eb21b10182fdcb1c75bd399ac1315e728a01c2939d6 WatchSource:0}: Error finding container 00c646d72793b79c17bc1eb21b10182fdcb1c75bd399ac1315e728a01c2939d6: Status 404 returned error can't find the container with id 00c646d72793b79c17bc1eb21b10182fdcb1c75bd399ac1315e728a01c2939d6 Jan 21 09:17:23 crc kubenswrapper[4618]: I0121 09:17:23.251408 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wmsxn"] Jan 21 09:17:23 crc kubenswrapper[4618]: I0121 09:17:23.252904 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:17:23 crc kubenswrapper[4618]: I0121 09:17:23.590709 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wmsxn" event={"ID":"0fb18034-16fd-4cfc-8748-2c58b8584346","Type":"ContainerStarted","Data":"00c646d72793b79c17bc1eb21b10182fdcb1c75bd399ac1315e728a01c2939d6"} Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.402647 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.444689 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-z8fw4"] Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.444901 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" 
containerName="dnsmasq-dns" containerID="cri-o://e0b6379b137637b274434eaf5e9bd8a43f2190a5792c18f0cad01a94fbdb2603" gracePeriod=10 Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.601960 4618 generic.go:334] "Generic (PLEG): container finished" podID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerID="e0b6379b137637b274434eaf5e9bd8a43f2190a5792c18f0cad01a94fbdb2603" exitCode=0 Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.602350 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" event={"ID":"aa6809e9-cec7-46ff-983a-2ae596e84add","Type":"ContainerDied","Data":"e0b6379b137637b274434eaf5e9bd8a43f2190a5792c18f0cad01a94fbdb2603"} Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.604512 4618 generic.go:334] "Generic (PLEG): container finished" podID="a34a0fe1-3391-4b76-8274-d817bcca6d03" containerID="56833f08d16e221c9db181c165afd00f5fbcf02f8c4e2b4d3fbcde896a922ed9" exitCode=0 Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.604541 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l5bzv" event={"ID":"a34a0fe1-3391-4b76-8274-d817bcca6d03","Type":"ContainerDied","Data":"56833f08d16e221c9db181c165afd00f5fbcf02f8c4e2b4d3fbcde896a922ed9"} Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.818746 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.947627 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-config\") pod \"aa6809e9-cec7-46ff-983a-2ae596e84add\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.947672 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-dns-svc\") pod \"aa6809e9-cec7-46ff-983a-2ae596e84add\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.947813 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sthwg\" (UniqueName: \"kubernetes.io/projected/aa6809e9-cec7-46ff-983a-2ae596e84add-kube-api-access-sthwg\") pod \"aa6809e9-cec7-46ff-983a-2ae596e84add\" (UID: \"aa6809e9-cec7-46ff-983a-2ae596e84add\") " Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.953383 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6809e9-cec7-46ff-983a-2ae596e84add-kube-api-access-sthwg" (OuterVolumeSpecName: "kube-api-access-sthwg") pod "aa6809e9-cec7-46ff-983a-2ae596e84add" (UID: "aa6809e9-cec7-46ff-983a-2ae596e84add"). InnerVolumeSpecName "kube-api-access-sthwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.979909 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-config" (OuterVolumeSpecName: "config") pod "aa6809e9-cec7-46ff-983a-2ae596e84add" (UID: "aa6809e9-cec7-46ff-983a-2ae596e84add"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:24 crc kubenswrapper[4618]: I0121 09:17:24.981651 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa6809e9-cec7-46ff-983a-2ae596e84add" (UID: "aa6809e9-cec7-46ff-983a-2ae596e84add"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.050563 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.050591 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6809e9-cec7-46ff-983a-2ae596e84add-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.050601 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sthwg\" (UniqueName: \"kubernetes.io/projected/aa6809e9-cec7-46ff-983a-2ae596e84add-kube-api-access-sthwg\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.421326 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-g8jg6"] Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.426352 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-g8jg6"] Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.551326 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a270ff81-6a20-462e-aa6a-5962f9109650" path="/var/lib/kubelet/pods/a270ff81-6a20-462e-aa6a-5962f9109650/volumes" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.614969 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.615602 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-z8fw4" event={"ID":"aa6809e9-cec7-46ff-983a-2ae596e84add","Type":"ContainerDied","Data":"0a167182c35c73fe86de338cdd40346720ce32b5ab5cf5b40ec93f7620253d7a"} Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.615658 4618 scope.go:117] "RemoveContainer" containerID="e0b6379b137637b274434eaf5e9bd8a43f2190a5792c18f0cad01a94fbdb2603" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.639807 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-z8fw4"] Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.645455 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-z8fw4"] Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.647437 4618 scope.go:117] "RemoveContainer" containerID="610ebb6a065b0cc2985c175610107b4f6a79bc053993156cfa26d8d2cf77d4dc" Jan 21 09:17:25 crc kubenswrapper[4618]: I0121 09:17:25.906334 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.022565 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073335 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-combined-ca-bundle\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073444 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-ring-data-devices\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073503 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a34a0fe1-3391-4b76-8274-d817bcca6d03-etc-swift\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073542 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-scripts\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073611 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zgh7\" (UniqueName: \"kubernetes.io/projected/a34a0fe1-3391-4b76-8274-d817bcca6d03-kube-api-access-9zgh7\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: 
\"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073670 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-swiftconf\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.073731 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-dispersionconf\") pod \"a34a0fe1-3391-4b76-8274-d817bcca6d03\" (UID: \"a34a0fe1-3391-4b76-8274-d817bcca6d03\") " Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.074382 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.074467 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34a0fe1-3391-4b76-8274-d817bcca6d03-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.078351 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34a0fe1-3391-4b76-8274-d817bcca6d03-kube-api-access-9zgh7" (OuterVolumeSpecName: "kube-api-access-9zgh7") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). 
InnerVolumeSpecName "kube-api-access-9zgh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.080318 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.090984 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-scripts" (OuterVolumeSpecName: "scripts") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.094817 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.096724 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a34a0fe1-3391-4b76-8274-d817bcca6d03" (UID: "a34a0fe1-3391-4b76-8274-d817bcca6d03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176691 4618 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176714 4618 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176725 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34a0fe1-3391-4b76-8274-d817bcca6d03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176734 4618 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176742 4618 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a34a0fe1-3391-4b76-8274-d817bcca6d03-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176768 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a34a0fe1-3391-4b76-8274-d817bcca6d03-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.176779 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zgh7\" (UniqueName: \"kubernetes.io/projected/a34a0fe1-3391-4b76-8274-d817bcca6d03-kube-api-access-9zgh7\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.623746 4618 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l5bzv" event={"ID":"a34a0fe1-3391-4b76-8274-d817bcca6d03","Type":"ContainerDied","Data":"b115c92ed2df8f530f294bef352f3b6bc510e264468f717e060dab0234cb9b7c"} Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.623788 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b115c92ed2df8f530f294bef352f3b6bc510e264468f717e060dab0234cb9b7c" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.623793 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-l5bzv" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.958682 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.958744 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.958815 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.959455 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b58e609790f66ef2752d711bb33506652d1731feac0ae2d67f3b94e098385deb"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Jan 21 09:17:26 crc kubenswrapper[4618]: I0121 09:17:26.959513 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://b58e609790f66ef2752d711bb33506652d1731feac0ae2d67f3b94e098385deb" gracePeriod=600 Jan 21 09:17:27 crc kubenswrapper[4618]: I0121 09:17:27.546934 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" path="/var/lib/kubelet/pods/aa6809e9-cec7-46ff-983a-2ae596e84add/volumes" Jan 21 09:17:27 crc kubenswrapper[4618]: I0121 09:17:27.650878 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="b58e609790f66ef2752d711bb33506652d1731feac0ae2d67f3b94e098385deb" exitCode=0 Jan 21 09:17:27 crc kubenswrapper[4618]: I0121 09:17:27.650920 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"b58e609790f66ef2752d711bb33506652d1731feac0ae2d67f3b94e098385deb"} Jan 21 09:17:27 crc kubenswrapper[4618]: I0121 09:17:27.650968 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"ba02e5d7a9b981ad1d2210de45ad9384cd8e1c52599c2747b664a4d50ae9a210"} Jan 21 09:17:27 crc kubenswrapper[4618]: I0121 09:17:27.650988 4618 scope.go:117] "RemoveContainer" containerID="54899a279b241edcd830c067b62c5fb70626feb80084fc4b9f8209133774eb23" Jan 21 09:17:29 crc kubenswrapper[4618]: E0121 09:17:29.174555 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76cf94b1_8904_4389_8ef3_8dd36ea02ecf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice\": RecentStats: unable to find data in memory cache]" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.425783 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-n4b6j"] Jan 21 09:17:30 crc kubenswrapper[4618]: E0121 09:17:30.426401 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerName="init" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.426414 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerName="init" Jan 21 09:17:30 crc kubenswrapper[4618]: E0121 09:17:30.426438 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34a0fe1-3391-4b76-8274-d817bcca6d03" containerName="swift-ring-rebalance" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.426444 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34a0fe1-3391-4b76-8274-d817bcca6d03" containerName="swift-ring-rebalance" Jan 21 09:17:30 crc kubenswrapper[4618]: E0121 09:17:30.426458 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerName="dnsmasq-dns" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.426463 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerName="dnsmasq-dns" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.426604 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34a0fe1-3391-4b76-8274-d817bcca6d03" containerName="swift-ring-rebalance" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.426619 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aa6809e9-cec7-46ff-983a-2ae596e84add" containerName="dnsmasq-dns" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.427101 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.429449 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.431260 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n4b6j"] Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.545797 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgxm\" (UniqueName: \"kubernetes.io/projected/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-kube-api-access-mrgxm\") pod \"root-account-create-update-n4b6j\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.545997 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-operator-scripts\") pod \"root-account-create-update-n4b6j\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.647100 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgxm\" (UniqueName: \"kubernetes.io/projected/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-kube-api-access-mrgxm\") pod \"root-account-create-update-n4b6j\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.647194 4618 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-operator-scripts\") pod \"root-account-create-update-n4b6j\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.648928 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-operator-scripts\") pod \"root-account-create-update-n4b6j\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.664460 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgxm\" (UniqueName: \"kubernetes.io/projected/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-kube-api-access-mrgxm\") pod \"root-account-create-update-n4b6j\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:30 crc kubenswrapper[4618]: I0121 09:17:30.744690 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:31 crc kubenswrapper[4618]: I0121 09:17:31.054233 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:31 crc kubenswrapper[4618]: I0121 09:17:31.062527 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67864b8b-0c06-4f06-8b43-87fcdd8a3d42-etc-swift\") pod \"swift-storage-0\" (UID: \"67864b8b-0c06-4f06-8b43-87fcdd8a3d42\") " pod="openstack/swift-storage-0" Jan 21 09:17:31 crc kubenswrapper[4618]: I0121 09:17:31.082948 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 21 09:17:32 crc kubenswrapper[4618]: I0121 09:17:32.874612 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n4b6j"] Jan 21 09:17:32 crc kubenswrapper[4618]: I0121 09:17:32.885645 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.703856 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wmsxn" event={"ID":"0fb18034-16fd-4cfc-8748-2c58b8584346","Type":"ContainerStarted","Data":"b0865635365d32c9c1fe6743f6d8d6b1f3944c3e71f87627a5b0e6f650bd393a"} Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.706324 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"3994b008e50e170959bdc97211e33ee8b6513f39b59220ccfc148a3538e05fc6"} Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.709573 4618 generic.go:334] "Generic (PLEG): container finished" 
podID="38a90b60-c5b7-409e-b6cd-53e7c1a0006e" containerID="010b11d67fefde8e399456132c1f94a50b6b261395e9da406debcaf9e7b310e2" exitCode=0 Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.709647 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4b6j" event={"ID":"38a90b60-c5b7-409e-b6cd-53e7c1a0006e","Type":"ContainerDied","Data":"010b11d67fefde8e399456132c1f94a50b6b261395e9da406debcaf9e7b310e2"} Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.709668 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4b6j" event={"ID":"38a90b60-c5b7-409e-b6cd-53e7c1a0006e","Type":"ContainerStarted","Data":"eae423aa55845331853406119f0c9f6c02e0895d1b433e6a4849ddb37067aa38"} Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.719085 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wmsxn" podStartSLOduration=2.507076863 podStartE2EDuration="11.719069996s" podCreationTimestamp="2026-01-21 09:17:22 +0000 UTC" firstStartedPulling="2026-01-21 09:17:23.252685704 +0000 UTC m=+842.003153021" lastFinishedPulling="2026-01-21 09:17:32.464678838 +0000 UTC m=+851.215146154" observedRunningTime="2026-01-21 09:17:33.715531 +0000 UTC m=+852.465998317" watchObservedRunningTime="2026-01-21 09:17:33.719069996 +0000 UTC m=+852.469537313" Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.725665 4618 generic.go:334] "Generic (PLEG): container finished" podID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerID="b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8" exitCode=0 Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.725756 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0a9f652e-69d5-4c54-a3e8-9d926313e47d","Type":"ContainerDied","Data":"b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8"} Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 
09:17:33.731318 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerID="84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea" exitCode=0 Jan 21 09:17:33 crc kubenswrapper[4618]: I0121 09:17:33.731369 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d8d0c9b-9097-462d-904e-7ff5126b1056","Type":"ContainerDied","Data":"84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.738775 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d8d0c9b-9097-462d-904e-7ff5126b1056","Type":"ContainerStarted","Data":"7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.739178 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.741593 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"1cd0d4c0d3de5c387bd968fd7786d29992722b052baf53460b37c09abec7bb33"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.741622 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"c92114a7016f80ce42852d9658d81bd61eb78b43530ee842057b8b123c04f370"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.741631 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"e8bc59eefcb950f5693e9dd2213a1499bd2f7a3393efa66b692da62973ae481f"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.741640 4618 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"3324bf6f345bcb1d61c6b391675ae753b19aa0c8d5601914681fabefa08a0e58"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.743318 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0a9f652e-69d5-4c54-a3e8-9d926313e47d","Type":"ContainerStarted","Data":"51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec"} Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.765255 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.018226485 podStartE2EDuration="47.76524287s" podCreationTimestamp="2026-01-21 09:16:47 +0000 UTC" firstStartedPulling="2026-01-21 09:16:55.950767553 +0000 UTC m=+814.701234870" lastFinishedPulling="2026-01-21 09:17:00.697783937 +0000 UTC m=+819.448251255" observedRunningTime="2026-01-21 09:17:34.761826346 +0000 UTC m=+853.512293662" watchObservedRunningTime="2026-01-21 09:17:34.76524287 +0000 UTC m=+853.515710188" Jan 21 09:17:34 crc kubenswrapper[4618]: I0121 09:17:34.788758 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.994392675 podStartE2EDuration="47.788745239s" podCreationTimestamp="2026-01-21 09:16:47 +0000 UTC" firstStartedPulling="2026-01-21 09:16:55.90132594 +0000 UTC m=+814.651793256" lastFinishedPulling="2026-01-21 09:17:00.695678503 +0000 UTC m=+819.446145820" observedRunningTime="2026-01-21 09:17:34.784693009 +0000 UTC m=+853.535160326" watchObservedRunningTime="2026-01-21 09:17:34.788745239 +0000 UTC m=+853.539212557" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.046564 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.228960 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-operator-scripts\") pod \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.229232 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrgxm\" (UniqueName: \"kubernetes.io/projected/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-kube-api-access-mrgxm\") pod \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\" (UID: \"38a90b60-c5b7-409e-b6cd-53e7c1a0006e\") " Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.229357 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38a90b60-c5b7-409e-b6cd-53e7c1a0006e" (UID: "38a90b60-c5b7-409e-b6cd-53e7c1a0006e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.229598 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.247638 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-kube-api-access-mrgxm" (OuterVolumeSpecName: "kube-api-access-mrgxm") pod "38a90b60-c5b7-409e-b6cd-53e7c1a0006e" (UID: "38a90b60-c5b7-409e-b6cd-53e7c1a0006e"). InnerVolumeSpecName "kube-api-access-mrgxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.331211 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrgxm\" (UniqueName: \"kubernetes.io/projected/38a90b60-c5b7-409e-b6cd-53e7c1a0006e-kube-api-access-mrgxm\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.750054 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4b6j" event={"ID":"38a90b60-c5b7-409e-b6cd-53e7c1a0006e","Type":"ContainerDied","Data":"eae423aa55845331853406119f0c9f6c02e0895d1b433e6a4849ddb37067aa38"} Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.750351 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae423aa55845331853406119f0c9f6c02e0895d1b433e6a4849ddb37067aa38" Jan 21 09:17:35 crc kubenswrapper[4618]: I0121 09:17:35.750086 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n4b6j" Jan 21 09:17:36 crc kubenswrapper[4618]: I0121 09:17:36.761217 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"0aae47e85d459928531b75d3624c650df89d75caf4555507d855353aab961f9b"} Jan 21 09:17:36 crc kubenswrapper[4618]: I0121 09:17:36.761263 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"91a62206df7dd445fcdd2f4cf62605cba63bf09dd46afdaec2d858d9dac6057d"} Jan 21 09:17:36 crc kubenswrapper[4618]: I0121 09:17:36.761275 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"9c8ec9ee94e52e3cbc8bd7b878237c9a7ff42d120def75e0feebb16b10a88b68"} Jan 21 09:17:36 crc 
kubenswrapper[4618]: I0121 09:17:36.761285 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"150fa780cc0f5d2b3b0658a54fd0e6ee7b88d40c2c28012ad2af4e471371ab21"} Jan 21 09:17:37 crc kubenswrapper[4618]: I0121 09:17:37.776524 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"3fd02d4fb5f6603e1fb6f6a8f1411bf5a93f94609eaa424f491346e7ed43b5ae"} Jan 21 09:17:37 crc kubenswrapper[4618]: I0121 09:17:37.776812 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"606b29a41378daa7abd1608caa1a52b928e54a195e108e6b199e6c9e2dce6cd3"} Jan 21 09:17:37 crc kubenswrapper[4618]: I0121 09:17:37.776831 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"cb698c8b8ff5471d2cd9886f350f04a9f388d7a778b99afa63f477f973e93895"} Jan 21 09:17:37 crc kubenswrapper[4618]: I0121 09:17:37.778357 4618 generic.go:334] "Generic (PLEG): container finished" podID="0fb18034-16fd-4cfc-8748-2c58b8584346" containerID="b0865635365d32c9c1fe6743f6d8d6b1f3944c3e71f87627a5b0e6f650bd393a" exitCode=0 Jan 21 09:17:37 crc kubenswrapper[4618]: I0121 09:17:37.778416 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wmsxn" event={"ID":"0fb18034-16fd-4cfc-8748-2c58b8584346","Type":"ContainerDied","Data":"b0865635365d32c9c1fe6743f6d8d6b1f3944c3e71f87627a5b0e6f650bd393a"} Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.142323 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v4n6j" podUID="a889a44f-3ea7-4b43-b5ea-1f365a9611ac" containerName="ovn-controller" 
probeResult="failure" output=< Jan 21 09:17:38 crc kubenswrapper[4618]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 21 09:17:38 crc kubenswrapper[4618]: > Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.158033 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.158270 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z8pqg" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.348709 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v4n6j-config-nhdcz"] Jan 21 09:17:38 crc kubenswrapper[4618]: E0121 09:17:38.349019 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a90b60-c5b7-409e-b6cd-53e7c1a0006e" containerName="mariadb-account-create-update" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.349036 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a90b60-c5b7-409e-b6cd-53e7c1a0006e" containerName="mariadb-account-create-update" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.349205 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a90b60-c5b7-409e-b6cd-53e7c1a0006e" containerName="mariadb-account-create-update" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.349707 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.352915 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.364724 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4n6j-config-nhdcz"] Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.485329 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.485407 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run-ovn\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.485515 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqkv\" (UniqueName: \"kubernetes.io/projected/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-kube-api-access-grqkv\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.485591 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-additional-scripts\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: 
\"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.485720 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-log-ovn\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.485778 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-scripts\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587304 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grqkv\" (UniqueName: \"kubernetes.io/projected/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-kube-api-access-grqkv\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587364 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-additional-scripts\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587416 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-log-ovn\") pod 
\"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587435 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-scripts\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587496 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587522 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run-ovn\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587832 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587885 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run-ovn\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: 
\"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.587900 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-log-ovn\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.588207 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-additional-scripts\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.590113 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-scripts\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.606782 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqkv\" (UniqueName: \"kubernetes.io/projected/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-kube-api-access-grqkv\") pod \"ovn-controller-v4n6j-config-nhdcz\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.694335 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.808242 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"7706a11f354179ce0cf120a6dcacd19b705c12a8c942074ab21c90f9c9923fd7"} Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.808289 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"6ccd969717f09e2a7fc26606bad007fabd2b995265f7c9a34ac75d15ff0d28b3"} Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.808302 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"d912d8c3f20463fdf99a0695c82ceec6ad90cc183fb69c8d3e72fae632940a08"} Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.808312 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67864b8b-0c06-4f06-8b43-87fcdd8a3d42","Type":"ContainerStarted","Data":"3348b39c05fc13da8103b32f62a88e770cd5f17bf69820254b10f042bd50cfe2"} Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.847034 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.437890358 podStartE2EDuration="24.847018521s" podCreationTimestamp="2026-01-21 09:17:14 +0000 UTC" firstStartedPulling="2026-01-21 09:17:32.901008997 +0000 UTC m=+851.651476315" lastFinishedPulling="2026-01-21 09:17:37.310137161 +0000 UTC m=+856.060604478" observedRunningTime="2026-01-21 09:17:38.842238993 +0000 UTC m=+857.592706310" watchObservedRunningTime="2026-01-21 09:17:38.847018521 +0000 UTC m=+857.597485838" Jan 21 09:17:38 crc kubenswrapper[4618]: I0121 09:17:38.890272 4618 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.117419 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8db84466c-xttqj"] Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.119834 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.122639 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.138157 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-xttqj"] Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.153438 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4n6j-config-nhdcz"] Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.189198 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.303650 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-combined-ca-bundle\") pod \"0fb18034-16fd-4cfc-8748-2c58b8584346\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.304424 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-config-data\") pod \"0fb18034-16fd-4cfc-8748-2c58b8584346\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.304462 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vpst\" (UniqueName: \"kubernetes.io/projected/0fb18034-16fd-4cfc-8748-2c58b8584346-kube-api-access-2vpst\") pod \"0fb18034-16fd-4cfc-8748-2c58b8584346\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.304508 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-db-sync-config-data\") pod \"0fb18034-16fd-4cfc-8748-2c58b8584346\" (UID: \"0fb18034-16fd-4cfc-8748-2c58b8584346\") " Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.304938 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdpsd\" (UniqueName: \"kubernetes.io/projected/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-kube-api-access-tdpsd\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.304980 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-sb\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.305018 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-svc\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.305040 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-config\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.305057 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-nb\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.305086 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-swift-storage-0\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 
09:17:39.310785 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb18034-16fd-4cfc-8748-2c58b8584346-kube-api-access-2vpst" (OuterVolumeSpecName: "kube-api-access-2vpst") pod "0fb18034-16fd-4cfc-8748-2c58b8584346" (UID: "0fb18034-16fd-4cfc-8748-2c58b8584346"). InnerVolumeSpecName "kube-api-access-2vpst". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.311126 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0fb18034-16fd-4cfc-8748-2c58b8584346" (UID: "0fb18034-16fd-4cfc-8748-2c58b8584346"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.337730 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fb18034-16fd-4cfc-8748-2c58b8584346" (UID: "0fb18034-16fd-4cfc-8748-2c58b8584346"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:39 crc kubenswrapper[4618]: E0121 09:17:39.338734 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76cf94b1_8904_4389_8ef3_8dd36ea02ecf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice\": RecentStats: unable to find data in memory cache]" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.370296 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-config-data" (OuterVolumeSpecName: "config-data") pod "0fb18034-16fd-4cfc-8748-2c58b8584346" (UID: "0fb18034-16fd-4cfc-8748-2c58b8584346"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.406201 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-svc\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.406253 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-config\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.406277 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.406351 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-swift-storage-0\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.406522 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdpsd\" (UniqueName: \"kubernetes.io/projected/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-kube-api-access-tdpsd\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.406565 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-sb\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407079 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-config\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407130 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 
09:17:39.407162 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407174 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vpst\" (UniqueName: \"kubernetes.io/projected/0fb18034-16fd-4cfc-8748-2c58b8584346-kube-api-access-2vpst\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407190 4618 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fb18034-16fd-4cfc-8748-2c58b8584346-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407331 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-svc\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407476 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-sb\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407677 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-nb\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.407729 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-swift-storage-0\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.425995 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdpsd\" (UniqueName: \"kubernetes.io/projected/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-kube-api-access-tdpsd\") pod \"dnsmasq-dns-8db84466c-xttqj\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.471417 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.819378 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wmsxn" event={"ID":"0fb18034-16fd-4cfc-8748-2c58b8584346","Type":"ContainerDied","Data":"00c646d72793b79c17bc1eb21b10182fdcb1c75bd399ac1315e728a01c2939d6"} Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.819657 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c646d72793b79c17bc1eb21b10182fdcb1c75bd399ac1315e728a01c2939d6" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.819420 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wmsxn" Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.821125 4618 generic.go:334] "Generic (PLEG): container finished" podID="91fa8e23-59d9-4b16-8a34-1ff76bb109c8" containerID="64e52c4c510a390b62d51e3ab84eba1a7d641d00c45adfa8f31062a12177b595" exitCode=0 Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.821335 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4n6j-config-nhdcz" event={"ID":"91fa8e23-59d9-4b16-8a34-1ff76bb109c8","Type":"ContainerDied","Data":"64e52c4c510a390b62d51e3ab84eba1a7d641d00c45adfa8f31062a12177b595"} Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.821378 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4n6j-config-nhdcz" event={"ID":"91fa8e23-59d9-4b16-8a34-1ff76bb109c8","Type":"ContainerStarted","Data":"4900d520775dc47deeb2e21a525306b3af9d507409c1ac5321adaabb9e16c0d8"} Jan 21 09:17:39 crc kubenswrapper[4618]: I0121 09:17:39.868356 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-xttqj"] Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.160682 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-xttqj"] Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.235417 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-mvdfm"] Jan 21 09:17:40 crc kubenswrapper[4618]: E0121 09:17:40.235903 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb18034-16fd-4cfc-8748-2c58b8584346" containerName="glance-db-sync" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.235925 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb18034-16fd-4cfc-8748-2c58b8584346" containerName="glance-db-sync" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.236186 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0fb18034-16fd-4cfc-8748-2c58b8584346" containerName="glance-db-sync" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.237411 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.247241 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-mvdfm"] Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.323970 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-config\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.324289 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.324331 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.324370 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: 
\"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.324406 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.324456 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8spjk\" (UniqueName: \"kubernetes.io/projected/97a18a3f-1479-4331-88b7-ca75b69d1187-kube-api-access-8spjk\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.426173 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.426257 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8spjk\" (UniqueName: \"kubernetes.io/projected/97a18a3f-1479-4331-88b7-ca75b69d1187-kube-api-access-8spjk\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.426306 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-config\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: 
\"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.426346 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.426371 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.426410 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.427349 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-config\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.427356 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" 
Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.427431 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.428278 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.428294 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.442536 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8spjk\" (UniqueName: \"kubernetes.io/projected/97a18a3f-1479-4331-88b7-ca75b69d1187-kube-api-access-8spjk\") pod \"dnsmasq-dns-74dfc89d77-mvdfm\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.556156 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.828667 4618 generic.go:334] "Generic (PLEG): container finished" podID="4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" containerID="d0814e94e7bfe9e2d03bd564c0367fdf7cb80b0ebdf1777a09a2b40e62c9c611" exitCode=0 Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.828778 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-xttqj" event={"ID":"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f","Type":"ContainerDied","Data":"d0814e94e7bfe9e2d03bd564c0367fdf7cb80b0ebdf1777a09a2b40e62c9c611"} Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.828911 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-xttqj" event={"ID":"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f","Type":"ContainerStarted","Data":"0f57dac5f5d038f85e960db3451e43491169a4f61d2b27ae47f0ea76c00ee994"} Jan 21 09:17:40 crc kubenswrapper[4618]: I0121 09:17:40.996284 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-mvdfm"] Jan 21 09:17:41 crc kubenswrapper[4618]: W0121 09:17:41.006098 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97a18a3f_1479_4331_88b7_ca75b69d1187.slice/crio-cb67e3bf0c56fc86c31c26c08a2ae96db8259f125e7e58f62d11c0eb04c910a6 WatchSource:0}: Error finding container cb67e3bf0c56fc86c31c26c08a2ae96db8259f125e7e58f62d11c0eb04c910a6: Status 404 returned error can't find the container with id cb67e3bf0c56fc86c31c26c08a2ae96db8259f125e7e58f62d11c0eb04c910a6 Jan 21 09:17:41 crc kubenswrapper[4618]: E0121 09:17:41.028997 4618 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 21 09:17:41 crc kubenswrapper[4618]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 21 09:17:41 crc kubenswrapper[4618]: > podSandboxID="0f57dac5f5d038f85e960db3451e43491169a4f61d2b27ae47f0ea76c00ee994" Jan 21 09:17:41 crc kubenswrapper[4618]: E0121 09:17:41.029210 4618 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 21 09:17:41 crc kubenswrapper[4618]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66chbbh56dh7fhfh68chf9hfdhbdh587h5b9h568h68fh77h5b5h559h577h687h574h5d5h584h8chd9hb4h66h566h545h699h564h568h66fhc9q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver
-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdpsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8db84466c-xttqj_openstack(4577c1ce-3eda-4b8b-bb06-2e2e3869a82f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 21 09:17:41 crc kubenswrapper[4618]: > logger="UnhandledError" Jan 21 09:17:41 crc 
kubenswrapper[4618]: E0121 09:17:41.030345 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8db84466c-xttqj" podUID="4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.099461 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241135 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-scripts\") pod \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241202 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-log-ovn\") pod \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241242 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grqkv\" (UniqueName: \"kubernetes.io/projected/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-kube-api-access-grqkv\") pod \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241270 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run\") pod 
\"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241266 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "91fa8e23-59d9-4b16-8a34-1ff76bb109c8" (UID: "91fa8e23-59d9-4b16-8a34-1ff76bb109c8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241351 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run-ovn\") pod \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241484 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-additional-scripts\") pod \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\" (UID: \"91fa8e23-59d9-4b16-8a34-1ff76bb109c8\") " Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241546 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "91fa8e23-59d9-4b16-8a34-1ff76bb109c8" (UID: "91fa8e23-59d9-4b16-8a34-1ff76bb109c8"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241563 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run" (OuterVolumeSpecName: "var-run") pod "91fa8e23-59d9-4b16-8a34-1ff76bb109c8" (UID: "91fa8e23-59d9-4b16-8a34-1ff76bb109c8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241853 4618 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241872 4618 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.241883 4618 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.242131 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "91fa8e23-59d9-4b16-8a34-1ff76bb109c8" (UID: "91fa8e23-59d9-4b16-8a34-1ff76bb109c8"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.242279 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-scripts" (OuterVolumeSpecName: "scripts") pod "91fa8e23-59d9-4b16-8a34-1ff76bb109c8" (UID: "91fa8e23-59d9-4b16-8a34-1ff76bb109c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.245195 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-kube-api-access-grqkv" (OuterVolumeSpecName: "kube-api-access-grqkv") pod "91fa8e23-59d9-4b16-8a34-1ff76bb109c8" (UID: "91fa8e23-59d9-4b16-8a34-1ff76bb109c8"). InnerVolumeSpecName "kube-api-access-grqkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.343442 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.343881 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grqkv\" (UniqueName: \"kubernetes.io/projected/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-kube-api-access-grqkv\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.343944 4618 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91fa8e23-59d9-4b16-8a34-1ff76bb109c8-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.840131 4618 generic.go:334] "Generic (PLEG): container finished" podID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerID="3e267a047f1b12fc59aec5c16292a075e33b49ab2af6ea0ee7d58412fc8f894e" 
exitCode=0 Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.840216 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" event={"ID":"97a18a3f-1479-4331-88b7-ca75b69d1187","Type":"ContainerDied","Data":"3e267a047f1b12fc59aec5c16292a075e33b49ab2af6ea0ee7d58412fc8f894e"} Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.840325 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" event={"ID":"97a18a3f-1479-4331-88b7-ca75b69d1187","Type":"ContainerStarted","Data":"cb67e3bf0c56fc86c31c26c08a2ae96db8259f125e7e58f62d11c0eb04c910a6"} Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.843023 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4n6j-config-nhdcz" Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.843221 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4n6j-config-nhdcz" event={"ID":"91fa8e23-59d9-4b16-8a34-1ff76bb109c8","Type":"ContainerDied","Data":"4900d520775dc47deeb2e21a525306b3af9d507409c1ac5321adaabb9e16c0d8"} Jan 21 09:17:41 crc kubenswrapper[4618]: I0121 09:17:41.843271 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4900d520775dc47deeb2e21a525306b3af9d507409c1ac5321adaabb9e16c0d8" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.104758 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.188432 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v4n6j-config-nhdcz"] Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.196569 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v4n6j-config-nhdcz"] Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.259698 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-svc\") pod \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.259769 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-config\") pod \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.259802 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-swift-storage-0\") pod \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.259906 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-sb\") pod \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.260121 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdpsd\" (UniqueName: 
\"kubernetes.io/projected/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-kube-api-access-tdpsd\") pod \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.260189 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-nb\") pod \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\" (UID: \"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f\") " Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.265921 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-kube-api-access-tdpsd" (OuterVolumeSpecName: "kube-api-access-tdpsd") pod "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" (UID: "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f"). InnerVolumeSpecName "kube-api-access-tdpsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.297472 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" (UID: "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.298447 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" (UID: "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.299208 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" (UID: "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.301585 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" (UID: "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.303727 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-config" (OuterVolumeSpecName: "config") pod "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" (UID: "4577c1ce-3eda-4b8b-bb06-2e2e3869a82f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.363529 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdpsd\" (UniqueName: \"kubernetes.io/projected/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-kube-api-access-tdpsd\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.363567 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.363583 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.363593 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.363606 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.363614 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.852072 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" event={"ID":"97a18a3f-1479-4331-88b7-ca75b69d1187","Type":"ContainerStarted","Data":"d72d9fd0db4bd4a5fa233b9faa4bcdd0adf4fd93a712b7dbf73f72a108c5a4c4"} Jan 21 09:17:42 crc 
kubenswrapper[4618]: I0121 09:17:42.852699 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.853756 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8db84466c-xttqj" event={"ID":"4577c1ce-3eda-4b8b-bb06-2e2e3869a82f","Type":"ContainerDied","Data":"0f57dac5f5d038f85e960db3451e43491169a4f61d2b27ae47f0ea76c00ee994"} Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.853806 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8db84466c-xttqj" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.853815 4618 scope.go:117] "RemoveContainer" containerID="d0814e94e7bfe9e2d03bd564c0367fdf7cb80b0ebdf1777a09a2b40e62c9c611" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.880840 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" podStartSLOduration=2.880821263 podStartE2EDuration="2.880821263s" podCreationTimestamp="2026-01-21 09:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:17:42.874458396 +0000 UTC m=+861.624925713" watchObservedRunningTime="2026-01-21 09:17:42.880821263 +0000 UTC m=+861.631288580" Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.934201 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-xttqj"] Jan 21 09:17:42 crc kubenswrapper[4618]: I0121 09:17:42.940214 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8db84466c-xttqj"] Jan 21 09:17:43 crc kubenswrapper[4618]: I0121 09:17:43.111879 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-v4n6j" Jan 21 09:17:43 crc kubenswrapper[4618]: I0121 09:17:43.545818 4618 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" path="/var/lib/kubelet/pods/4577c1ce-3eda-4b8b-bb06-2e2e3869a82f/volumes" Jan 21 09:17:43 crc kubenswrapper[4618]: I0121 09:17:43.546788 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91fa8e23-59d9-4b16-8a34-1ff76bb109c8" path="/var/lib/kubelet/pods/91fa8e23-59d9-4b16-8a34-1ff76bb109c8/volumes" Jan 21 09:17:48 crc kubenswrapper[4618]: I0121 09:17:48.892429 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.119369 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-p4qcp"] Jan 21 09:17:49 crc kubenswrapper[4618]: E0121 09:17:49.119812 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fa8e23-59d9-4b16-8a34-1ff76bb109c8" containerName="ovn-config" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.119892 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fa8e23-59d9-4b16-8a34-1ff76bb109c8" containerName="ovn-config" Jan 21 09:17:49 crc kubenswrapper[4618]: E0121 09:17:49.119961 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" containerName="init" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.120014 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" containerName="init" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.120217 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="4577c1ce-3eda-4b8b-bb06-2e2e3869a82f" containerName="init" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.120350 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fa8e23-59d9-4b16-8a34-1ff76bb109c8" containerName="ovn-config" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.120848 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.139342 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.139537 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p4qcp"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.210669 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-w578p"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.211628 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.220798 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-dfce-account-create-update-l66gf"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.221680 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.229459 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.235378 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dfce-account-create-update-l66gf"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.256327 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w578p"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.277466 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2835ff70-c16c-42dd-8971-f3cfd0ae800f-operator-scripts\") pod \"cinder-db-create-p4qcp\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.277616 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b47qb\" (UniqueName: \"kubernetes.io/projected/2835ff70-c16c-42dd-8971-f3cfd0ae800f-kube-api-access-b47qb\") pod \"cinder-db-create-p4qcp\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.326198 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-efa7-account-create-update-btb2b"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.327122 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.329105 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.341924 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-efa7-account-create-update-btb2b"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.379534 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2835ff70-c16c-42dd-8971-f3cfd0ae800f-operator-scripts\") pod \"cinder-db-create-p4qcp\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.379765 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37791f6-6d08-4964-9ac8-c9d99f04979c-operator-scripts\") pod \"barbican-db-create-w578p\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.379848 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e120b8a-c3f7-41ec-8987-a8b74198bc74-operator-scripts\") pod \"cinder-dfce-account-create-update-l66gf\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.379927 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b47qb\" (UniqueName: \"kubernetes.io/projected/2835ff70-c16c-42dd-8971-f3cfd0ae800f-kube-api-access-b47qb\") pod \"cinder-db-create-p4qcp\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " 
pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.380044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jpf\" (UniqueName: \"kubernetes.io/projected/1e120b8a-c3f7-41ec-8987-a8b74198bc74-kube-api-access-s4jpf\") pod \"cinder-dfce-account-create-update-l66gf\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.380133 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8vhh\" (UniqueName: \"kubernetes.io/projected/a37791f6-6d08-4964-9ac8-c9d99f04979c-kube-api-access-q8vhh\") pod \"barbican-db-create-w578p\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.380255 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2835ff70-c16c-42dd-8971-f3cfd0ae800f-operator-scripts\") pod \"cinder-db-create-p4qcp\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.430773 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b47qb\" (UniqueName: \"kubernetes.io/projected/2835ff70-c16c-42dd-8971-f3cfd0ae800f-kube-api-access-b47qb\") pod \"cinder-db-create-p4qcp\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.440450 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482052 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3cd0644-db42-42d1-884b-8c341d4eb1c9-operator-scripts\") pod \"barbican-efa7-account-create-update-btb2b\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482548 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jpf\" (UniqueName: \"kubernetes.io/projected/1e120b8a-c3f7-41ec-8987-a8b74198bc74-kube-api-access-s4jpf\") pod \"cinder-dfce-account-create-update-l66gf\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482653 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8vhh\" (UniqueName: \"kubernetes.io/projected/a37791f6-6d08-4964-9ac8-c9d99f04979c-kube-api-access-q8vhh\") pod \"barbican-db-create-w578p\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482747 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qh5h\" (UniqueName: \"kubernetes.io/projected/e3cd0644-db42-42d1-884b-8c341d4eb1c9-kube-api-access-8qh5h\") pod \"barbican-efa7-account-create-update-btb2b\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482843 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a37791f6-6d08-4964-9ac8-c9d99f04979c-operator-scripts\") pod \"barbican-db-create-w578p\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482912 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e120b8a-c3f7-41ec-8987-a8b74198bc74-operator-scripts\") pod \"cinder-dfce-account-create-update-l66gf\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.483592 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e120b8a-c3f7-41ec-8987-a8b74198bc74-operator-scripts\") pod \"cinder-dfce-account-create-update-l66gf\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.482171 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mglts"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.504873 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.506213 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mglts"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.522470 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4jpf\" (UniqueName: \"kubernetes.io/projected/1e120b8a-c3f7-41ec-8987-a8b74198bc74-kube-api-access-s4jpf\") pod \"cinder-dfce-account-create-update-l66gf\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.563225 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.584630 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8vhh\" (UniqueName: \"kubernetes.io/projected/a37791f6-6d08-4964-9ac8-c9d99f04979c-kube-api-access-q8vhh\") pod \"barbican-db-create-w578p\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.586052 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmwr\" (UniqueName: \"kubernetes.io/projected/a0ff954b-927c-4066-b75d-0e0f4530f888-kube-api-access-9dmwr\") pod \"neutron-db-create-mglts\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.586110 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qh5h\" (UniqueName: \"kubernetes.io/projected/e3cd0644-db42-42d1-884b-8c341d4eb1c9-kube-api-access-8qh5h\") pod \"barbican-efa7-account-create-update-btb2b\" (UID: 
\"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.586371 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3cd0644-db42-42d1-884b-8c341d4eb1c9-operator-scripts\") pod \"barbican-efa7-account-create-update-btb2b\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.586425 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0ff954b-927c-4066-b75d-0e0f4530f888-operator-scripts\") pod \"neutron-db-create-mglts\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.588590 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3cd0644-db42-42d1-884b-8c341d4eb1c9-operator-scripts\") pod \"barbican-efa7-account-create-update-btb2b\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.598937 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37791f6-6d08-4964-9ac8-c9d99f04979c-operator-scripts\") pod \"barbican-db-create-w578p\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.618628 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qh5h\" (UniqueName: \"kubernetes.io/projected/e3cd0644-db42-42d1-884b-8c341d4eb1c9-kube-api-access-8qh5h\") pod 
\"barbican-efa7-account-create-update-btb2b\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.623661 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mdmw7"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.624672 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mdmw7"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.624741 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.628093 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.628273 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.628502 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.628612 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l8whb" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.635878 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-00df-account-create-update-d7dj6"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.637012 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.639499 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.639776 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.654716 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-00df-account-create-update-d7dj6"] Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.689098 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-config-data\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.689190 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0ff954b-927c-4066-b75d-0e0f4530f888-operator-scripts\") pod \"neutron-db-create-mglts\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.689775 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0ff954b-927c-4066-b75d-0e0f4530f888-operator-scripts\") pod \"neutron-db-create-mglts\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.689833 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dmwr\" (UniqueName: \"kubernetes.io/projected/a0ff954b-927c-4066-b75d-0e0f4530f888-kube-api-access-9dmwr\") pod \"neutron-db-create-mglts\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.690114 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xccvl\" (UniqueName: 
\"kubernetes.io/projected/e651bca2-bb3f-4946-a656-8900b6c25427-kube-api-access-xccvl\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.690184 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-combined-ca-bundle\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.690310 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9359c136-cac8-46b2-b341-b893187d9476-operator-scripts\") pod \"neutron-00df-account-create-update-d7dj6\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.690327 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fkjp\" (UniqueName: \"kubernetes.io/projected/9359c136-cac8-46b2-b341-b893187d9476-kube-api-access-2fkjp\") pod \"neutron-00df-account-create-update-d7dj6\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:49 crc kubenswrapper[4618]: E0121 09:17:49.707591 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76cf94b1_8904_4389_8ef3_8dd36ea02ecf.slice\": RecentStats: unable to find data in memory cache]" Jan 
21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.708080 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmwr\" (UniqueName: \"kubernetes.io/projected/a0ff954b-927c-4066-b75d-0e0f4530f888-kube-api-access-9dmwr\") pod \"neutron-db-create-mglts\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.791600 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xccvl\" (UniqueName: \"kubernetes.io/projected/e651bca2-bb3f-4946-a656-8900b6c25427-kube-api-access-xccvl\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.791656 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-combined-ca-bundle\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.791721 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fkjp\" (UniqueName: \"kubernetes.io/projected/9359c136-cac8-46b2-b341-b893187d9476-kube-api-access-2fkjp\") pod \"neutron-00df-account-create-update-d7dj6\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.791749 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9359c136-cac8-46b2-b341-b893187d9476-operator-scripts\") pod \"neutron-00df-account-create-update-d7dj6\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " pod="openstack/neutron-00df-account-create-update-d7dj6" 
Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.791778 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-config-data\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.793744 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9359c136-cac8-46b2-b341-b893187d9476-operator-scripts\") pod \"neutron-00df-account-create-update-d7dj6\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.796067 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-config-data\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.796108 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-combined-ca-bundle\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.810532 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fkjp\" (UniqueName: \"kubernetes.io/projected/9359c136-cac8-46b2-b341-b893187d9476-kube-api-access-2fkjp\") pod \"neutron-00df-account-create-update-d7dj6\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.812565 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xccvl\" (UniqueName: \"kubernetes.io/projected/e651bca2-bb3f-4946-a656-8900b6c25427-kube-api-access-xccvl\") pod \"keystone-db-sync-mdmw7\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.834612 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w578p" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.948449 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mglts" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.961701 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:49 crc kubenswrapper[4618]: I0121 09:17:49.968719 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.044053 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p4qcp"] Jan 21 09:17:50 crc kubenswrapper[4618]: W0121 09:17:50.049652 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2835ff70_c16c_42dd_8971_f3cfd0ae800f.slice/crio-94860c078c7fb02df5adaebc2bc8d7484bc07945d5b5afc4f123775960b2cad7 WatchSource:0}: Error finding container 94860c078c7fb02df5adaebc2bc8d7484bc07945d5b5afc4f123775960b2cad7: Status 404 returned error can't find the container with id 94860c078c7fb02df5adaebc2bc8d7484bc07945d5b5afc4f123775960b2cad7 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.094541 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dfce-account-create-update-l66gf"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.151887 4618 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-efa7-account-create-update-btb2b"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.197018 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mglts"] Jan 21 09:17:50 crc kubenswrapper[4618]: W0121 09:17:50.212785 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0ff954b_927c_4066_b75d_0e0f4530f888.slice/crio-a9ebc53a56a45aeee5608e13151f31a99c3101d2a628c9b00d19b7571611b2eb WatchSource:0}: Error finding container a9ebc53a56a45aeee5608e13151f31a99c3101d2a628c9b00d19b7571611b2eb: Status 404 returned error can't find the container with id a9ebc53a56a45aeee5608e13151f31a99c3101d2a628c9b00d19b7571611b2eb Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.250124 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w578p"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.439605 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mdmw7"] Jan 21 09:17:50 crc kubenswrapper[4618]: W0121 09:17:50.464443 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode651bca2_bb3f_4946_a656_8900b6c25427.slice/crio-b72c3c7a36d00eea852ba616862dd46700300b7af59229ee8fd96d7c87ccc47c WatchSource:0}: Error finding container b72c3c7a36d00eea852ba616862dd46700300b7af59229ee8fd96d7c87ccc47c: Status 404 returned error can't find the container with id b72c3c7a36d00eea852ba616862dd46700300b7af59229ee8fd96d7c87ccc47c Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.521490 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-00df-account-create-update-d7dj6"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.527579 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vmx97"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.528992 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: W0121 09:17:50.529331 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9359c136_cac8_46b2_b341_b893187d9476.slice/crio-8b65c602e5a821f494aeb36fe04a94932925fabe28b4c94f64c6d91317cdaeb0 WatchSource:0}: Error finding container 8b65c602e5a821f494aeb36fe04a94932925fabe28b4c94f64c6d91317cdaeb0: Status 404 returned error can't find the container with id 8b65c602e5a821f494aeb36fe04a94932925fabe28b4c94f64c6d91317cdaeb0 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.549237 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmx97"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.558269 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.602907 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdc5\" (UniqueName: \"kubernetes.io/projected/af9e4105-9c76-46df-a046-76f2efb55036-kube-api-access-6wdc5\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.602948 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-utilities\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc 
kubenswrapper[4618]: I0121 09:17:50.603097 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-catalog-content\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.633063 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-qcpxr"] Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.633278 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerName="dnsmasq-dns" containerID="cri-o://c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204" gracePeriod=10 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.705244 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdc5\" (UniqueName: \"kubernetes.io/projected/af9e4105-9c76-46df-a046-76f2efb55036-kube-api-access-6wdc5\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.705295 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-utilities\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.705350 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-catalog-content\") pod \"redhat-marketplace-vmx97\" 
(UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.705857 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-catalog-content\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.705995 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-utilities\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.722977 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdc5\" (UniqueName: \"kubernetes.io/projected/af9e4105-9c76-46df-a046-76f2efb55036-kube-api-access-6wdc5\") pod \"redhat-marketplace-vmx97\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.836828 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.920866 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w578p" event={"ID":"a37791f6-6d08-4964-9ac8-c9d99f04979c","Type":"ContainerStarted","Data":"ede8bfe8c1f35eda989d6db1469c3ab0c187fbfcdfac2df094785949c48196e3"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.920914 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w578p" event={"ID":"a37791f6-6d08-4964-9ac8-c9d99f04979c","Type":"ContainerStarted","Data":"bfc725b5516191b4eddeba5edd293760d286c6f9707ac28067b77d3489ef1d55"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.922483 4618 generic.go:334] "Generic (PLEG): container finished" podID="1e120b8a-c3f7-41ec-8987-a8b74198bc74" containerID="58bc2e013ab44651fb62f3d974e744d7e6980163e2de14a29cf610864a23d4cd" exitCode=0 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.922643 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dfce-account-create-update-l66gf" event={"ID":"1e120b8a-c3f7-41ec-8987-a8b74198bc74","Type":"ContainerDied","Data":"58bc2e013ab44651fb62f3d974e744d7e6980163e2de14a29cf610864a23d4cd"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.922666 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dfce-account-create-update-l66gf" event={"ID":"1e120b8a-c3f7-41ec-8987-a8b74198bc74","Type":"ContainerStarted","Data":"a74c98cdc3a59d69cfceb345c039296a3fd5cae583a1c655cff364b8cbe6d90b"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.923780 4618 generic.go:334] "Generic (PLEG): container finished" podID="2835ff70-c16c-42dd-8971-f3cfd0ae800f" containerID="ff76ee5bc7affd2950b673c6ee148d97c0543b6ac93970aedab3ced0c91a3005" exitCode=0 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.923846 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-p4qcp" event={"ID":"2835ff70-c16c-42dd-8971-f3cfd0ae800f","Type":"ContainerDied","Data":"ff76ee5bc7affd2950b673c6ee148d97c0543b6ac93970aedab3ced0c91a3005"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.923871 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p4qcp" event={"ID":"2835ff70-c16c-42dd-8971-f3cfd0ae800f","Type":"ContainerStarted","Data":"94860c078c7fb02df5adaebc2bc8d7484bc07945d5b5afc4f123775960b2cad7"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.926484 4618 generic.go:334] "Generic (PLEG): container finished" podID="a0ff954b-927c-4066-b75d-0e0f4530f888" containerID="e94ac90fbc6029d8088933b785aa87f1603f41efe025bde503fa963ec1a850df" exitCode=0 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.926575 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mglts" event={"ID":"a0ff954b-927c-4066-b75d-0e0f4530f888","Type":"ContainerDied","Data":"e94ac90fbc6029d8088933b785aa87f1603f41efe025bde503fa963ec1a850df"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.926609 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mglts" event={"ID":"a0ff954b-927c-4066-b75d-0e0f4530f888","Type":"ContainerStarted","Data":"a9ebc53a56a45aeee5608e13151f31a99c3101d2a628c9b00d19b7571611b2eb"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.927449 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-00df-account-create-update-d7dj6" event={"ID":"9359c136-cac8-46b2-b341-b893187d9476","Type":"ContainerStarted","Data":"8b65c602e5a821f494aeb36fe04a94932925fabe28b4c94f64c6d91317cdaeb0"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.928329 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mdmw7" 
event={"ID":"e651bca2-bb3f-4946-a656-8900b6c25427","Type":"ContainerStarted","Data":"b72c3c7a36d00eea852ba616862dd46700300b7af59229ee8fd96d7c87ccc47c"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.929515 4618 generic.go:334] "Generic (PLEG): container finished" podID="e3cd0644-db42-42d1-884b-8c341d4eb1c9" containerID="351aa67e75091cda85e1f589ab12fe5602e0ddd1793e747e581af3b2c01baee2" exitCode=0 Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.929547 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-efa7-account-create-update-btb2b" event={"ID":"e3cd0644-db42-42d1-884b-8c341d4eb1c9","Type":"ContainerDied","Data":"351aa67e75091cda85e1f589ab12fe5602e0ddd1793e747e581af3b2c01baee2"} Jan 21 09:17:50 crc kubenswrapper[4618]: I0121 09:17:50.929565 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-efa7-account-create-update-btb2b" event={"ID":"e3cd0644-db42-42d1-884b-8c341d4eb1c9","Type":"ContainerStarted","Data":"1cf39fcb2b9468f77b56b0e17a7cc14564a6db51fe1a715a484b6fe0c58371f0"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.007628 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-w578p" podStartSLOduration=2.007609427 podStartE2EDuration="2.007609427s" podCreationTimestamp="2026-01-21 09:17:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:17:50.944381778 +0000 UTC m=+869.694849095" watchObservedRunningTime="2026-01-21 09:17:51.007609427 +0000 UTC m=+869.758076744" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.314086 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmx97"] Jan 21 09:17:51 crc kubenswrapper[4618]: W0121 09:17:51.316372 4618 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9e4105_9c76_46df_a046_76f2efb55036.slice/crio-cb4025302e64ad7f7563d3da97382f84c31ef7f961bbdec9fb547a6bb4ecb3e4 WatchSource:0}: Error finding container cb4025302e64ad7f7563d3da97382f84c31ef7f961bbdec9fb547a6bb4ecb3e4: Status 404 returned error can't find the container with id cb4025302e64ad7f7563d3da97382f84c31ef7f961bbdec9fb547a6bb4ecb3e4 Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.631304 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.740126 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-dns-svc\") pod \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.740327 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-config\") pod \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.740385 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2cq6\" (UniqueName: \"kubernetes.io/projected/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-kube-api-access-h2cq6\") pod \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.740562 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-nb\") pod \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") 
" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.740611 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-sb\") pod \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\" (UID: \"b1bfd2f4-8562-470f-b4ec-ac051f8565d1\") " Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.751276 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-kube-api-access-h2cq6" (OuterVolumeSpecName: "kube-api-access-h2cq6") pod "b1bfd2f4-8562-470f-b4ec-ac051f8565d1" (UID: "b1bfd2f4-8562-470f-b4ec-ac051f8565d1"). InnerVolumeSpecName "kube-api-access-h2cq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.781797 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1bfd2f4-8562-470f-b4ec-ac051f8565d1" (UID: "b1bfd2f4-8562-470f-b4ec-ac051f8565d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.790398 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1bfd2f4-8562-470f-b4ec-ac051f8565d1" (UID: "b1bfd2f4-8562-470f-b4ec-ac051f8565d1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.803632 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-config" (OuterVolumeSpecName: "config") pod "b1bfd2f4-8562-470f-b4ec-ac051f8565d1" (UID: "b1bfd2f4-8562-470f-b4ec-ac051f8565d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.833822 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1bfd2f4-8562-470f-b4ec-ac051f8565d1" (UID: "b1bfd2f4-8562-470f-b4ec-ac051f8565d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.843235 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.843259 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2cq6\" (UniqueName: \"kubernetes.io/projected/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-kube-api-access-h2cq6\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.843271 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.843281 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:51 crc 
kubenswrapper[4618]: I0121 09:17:51.843288 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1bfd2f4-8562-470f-b4ec-ac051f8565d1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.940609 4618 generic.go:334] "Generic (PLEG): container finished" podID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerID="c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204" exitCode=0 Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.940682 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" event={"ID":"b1bfd2f4-8562-470f-b4ec-ac051f8565d1","Type":"ContainerDied","Data":"c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.940751 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" event={"ID":"b1bfd2f4-8562-470f-b4ec-ac051f8565d1","Type":"ContainerDied","Data":"c2797755ab18079c42f6fcb2b20975c2832985f25c256c49d85d24b28d45ed0f"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.940773 4618 scope.go:117] "RemoveContainer" containerID="c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.940815 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-qcpxr" Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.950881 4618 generic.go:334] "Generic (PLEG): container finished" podID="af9e4105-9c76-46df-a046-76f2efb55036" containerID="c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201" exitCode=0 Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.950914 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmx97" event={"ID":"af9e4105-9c76-46df-a046-76f2efb55036","Type":"ContainerDied","Data":"c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.950957 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmx97" event={"ID":"af9e4105-9c76-46df-a046-76f2efb55036","Type":"ContainerStarted","Data":"cb4025302e64ad7f7563d3da97382f84c31ef7f961bbdec9fb547a6bb4ecb3e4"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.954698 4618 generic.go:334] "Generic (PLEG): container finished" podID="a37791f6-6d08-4964-9ac8-c9d99f04979c" containerID="ede8bfe8c1f35eda989d6db1469c3ab0c187fbfcdfac2df094785949c48196e3" exitCode=0 Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.954793 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w578p" event={"ID":"a37791f6-6d08-4964-9ac8-c9d99f04979c","Type":"ContainerDied","Data":"ede8bfe8c1f35eda989d6db1469c3ab0c187fbfcdfac2df094785949c48196e3"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.957372 4618 generic.go:334] "Generic (PLEG): container finished" podID="9359c136-cac8-46b2-b341-b893187d9476" containerID="0d433ae38db2df7bbdad947799db2a7da6290b91bedcf10ead42729f9839d023" exitCode=0 Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.957460 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-00df-account-create-update-d7dj6" 
event={"ID":"9359c136-cac8-46b2-b341-b893187d9476","Type":"ContainerDied","Data":"0d433ae38db2df7bbdad947799db2a7da6290b91bedcf10ead42729f9839d023"} Jan 21 09:17:51 crc kubenswrapper[4618]: I0121 09:17:51.986904 4618 scope.go:117] "RemoveContainer" containerID="459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.016218 4618 scope.go:117] "RemoveContainer" containerID="c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204" Jan 21 09:17:52 crc kubenswrapper[4618]: E0121 09:17:52.016568 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204\": container with ID starting with c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204 not found: ID does not exist" containerID="c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.016604 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204"} err="failed to get container status \"c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204\": rpc error: code = NotFound desc = could not find container \"c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204\": container with ID starting with c9d1424f5d44b6047bb9a55d47aaf2cdb51e0bd8e3d08177724b22423a7f7204 not found: ID does not exist" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.016627 4618 scope.go:117] "RemoveContainer" containerID="459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd" Jan 21 09:17:52 crc kubenswrapper[4618]: E0121 09:17:52.017786 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd\": container with ID starting with 459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd not found: ID does not exist" containerID="459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.017821 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd"} err="failed to get container status \"459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd\": rpc error: code = NotFound desc = could not find container \"459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd\": container with ID starting with 459c223a7105b944898c51afeed19e9118e97dfc3982e24f5df2c9da700ab1fd not found: ID does not exist" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.032892 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-qcpxr"] Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.038779 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-qcpxr"] Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.292431 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.339455 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.345967 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.353456 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mglts" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.453249 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3cd0644-db42-42d1-884b-8c341d4eb1c9-operator-scripts\") pod \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.453339 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4jpf\" (UniqueName: \"kubernetes.io/projected/1e120b8a-c3f7-41ec-8987-a8b74198bc74-kube-api-access-s4jpf\") pod \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.453462 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2835ff70-c16c-42dd-8971-f3cfd0ae800f-operator-scripts\") pod \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.453503 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e120b8a-c3f7-41ec-8987-a8b74198bc74-operator-scripts\") pod \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\" (UID: \"1e120b8a-c3f7-41ec-8987-a8b74198bc74\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.453564 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b47qb\" (UniqueName: \"kubernetes.io/projected/2835ff70-c16c-42dd-8971-f3cfd0ae800f-kube-api-access-b47qb\") pod \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\" (UID: \"2835ff70-c16c-42dd-8971-f3cfd0ae800f\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.453617 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8qh5h\" (UniqueName: \"kubernetes.io/projected/e3cd0644-db42-42d1-884b-8c341d4eb1c9-kube-api-access-8qh5h\") pod \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\" (UID: \"e3cd0644-db42-42d1-884b-8c341d4eb1c9\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.454402 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3cd0644-db42-42d1-884b-8c341d4eb1c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3cd0644-db42-42d1-884b-8c341d4eb1c9" (UID: "e3cd0644-db42-42d1-884b-8c341d4eb1c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.454473 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2835ff70-c16c-42dd-8971-f3cfd0ae800f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2835ff70-c16c-42dd-8971-f3cfd0ae800f" (UID: "2835ff70-c16c-42dd-8971-f3cfd0ae800f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.454883 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e120b8a-c3f7-41ec-8987-a8b74198bc74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e120b8a-c3f7-41ec-8987-a8b74198bc74" (UID: "1e120b8a-c3f7-41ec-8987-a8b74198bc74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.460044 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cd0644-db42-42d1-884b-8c341d4eb1c9-kube-api-access-8qh5h" (OuterVolumeSpecName: "kube-api-access-8qh5h") pod "e3cd0644-db42-42d1-884b-8c341d4eb1c9" (UID: "e3cd0644-db42-42d1-884b-8c341d4eb1c9"). 
InnerVolumeSpecName "kube-api-access-8qh5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.460586 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e120b8a-c3f7-41ec-8987-a8b74198bc74-kube-api-access-s4jpf" (OuterVolumeSpecName: "kube-api-access-s4jpf") pod "1e120b8a-c3f7-41ec-8987-a8b74198bc74" (UID: "1e120b8a-c3f7-41ec-8987-a8b74198bc74"). InnerVolumeSpecName "kube-api-access-s4jpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.465265 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2835ff70-c16c-42dd-8971-f3cfd0ae800f-kube-api-access-b47qb" (OuterVolumeSpecName: "kube-api-access-b47qb") pod "2835ff70-c16c-42dd-8971-f3cfd0ae800f" (UID: "2835ff70-c16c-42dd-8971-f3cfd0ae800f"). InnerVolumeSpecName "kube-api-access-b47qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.554880 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0ff954b-927c-4066-b75d-0e0f4530f888-operator-scripts\") pod \"a0ff954b-927c-4066-b75d-0e0f4530f888\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.555605 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ff954b-927c-4066-b75d-0e0f4530f888-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0ff954b-927c-4066-b75d-0e0f4530f888" (UID: "a0ff954b-927c-4066-b75d-0e0f4530f888"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.557452 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dmwr\" (UniqueName: \"kubernetes.io/projected/a0ff954b-927c-4066-b75d-0e0f4530f888-kube-api-access-9dmwr\") pod \"a0ff954b-927c-4066-b75d-0e0f4530f888\" (UID: \"a0ff954b-927c-4066-b75d-0e0f4530f888\") " Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559302 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2835ff70-c16c-42dd-8971-f3cfd0ae800f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559386 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e120b8a-c3f7-41ec-8987-a8b74198bc74-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559830 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b47qb\" (UniqueName: \"kubernetes.io/projected/2835ff70-c16c-42dd-8971-f3cfd0ae800f-kube-api-access-b47qb\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559854 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0ff954b-927c-4066-b75d-0e0f4530f888-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559866 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qh5h\" (UniqueName: \"kubernetes.io/projected/e3cd0644-db42-42d1-884b-8c341d4eb1c9-kube-api-access-8qh5h\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559885 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e3cd0644-db42-42d1-884b-8c341d4eb1c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.559896 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4jpf\" (UniqueName: \"kubernetes.io/projected/1e120b8a-c3f7-41ec-8987-a8b74198bc74-kube-api-access-s4jpf\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.562393 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ff954b-927c-4066-b75d-0e0f4530f888-kube-api-access-9dmwr" (OuterVolumeSpecName: "kube-api-access-9dmwr") pod "a0ff954b-927c-4066-b75d-0e0f4530f888" (UID: "a0ff954b-927c-4066-b75d-0e0f4530f888"). InnerVolumeSpecName "kube-api-access-9dmwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.662550 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dmwr\" (UniqueName: \"kubernetes.io/projected/a0ff954b-927c-4066-b75d-0e0f4530f888-kube-api-access-9dmwr\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.968502 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dfce-account-create-update-l66gf" event={"ID":"1e120b8a-c3f7-41ec-8987-a8b74198bc74","Type":"ContainerDied","Data":"a74c98cdc3a59d69cfceb345c039296a3fd5cae583a1c655cff364b8cbe6d90b"} Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.968796 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74c98cdc3a59d69cfceb345c039296a3fd5cae583a1c655cff364b8cbe6d90b" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.968871 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-dfce-account-create-update-l66gf" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.971125 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p4qcp" event={"ID":"2835ff70-c16c-42dd-8971-f3cfd0ae800f","Type":"ContainerDied","Data":"94860c078c7fb02df5adaebc2bc8d7484bc07945d5b5afc4f123775960b2cad7"} Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.971286 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94860c078c7fb02df5adaebc2bc8d7484bc07945d5b5afc4f123775960b2cad7" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.971365 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p4qcp" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.974371 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mglts" event={"ID":"a0ff954b-927c-4066-b75d-0e0f4530f888","Type":"ContainerDied","Data":"a9ebc53a56a45aeee5608e13151f31a99c3101d2a628c9b00d19b7571611b2eb"} Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.974424 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ebc53a56a45aeee5608e13151f31a99c3101d2a628c9b00d19b7571611b2eb" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.974471 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mglts" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.976156 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-efa7-account-create-update-btb2b" event={"ID":"e3cd0644-db42-42d1-884b-8c341d4eb1c9","Type":"ContainerDied","Data":"1cf39fcb2b9468f77b56b0e17a7cc14564a6db51fe1a715a484b6fe0c58371f0"} Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.976197 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf39fcb2b9468f77b56b0e17a7cc14564a6db51fe1a715a484b6fe0c58371f0" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.976287 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-efa7-account-create-update-btb2b" Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.980781 4618 generic.go:334] "Generic (PLEG): container finished" podID="af9e4105-9c76-46df-a046-76f2efb55036" containerID="29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2" exitCode=0 Jan 21 09:17:52 crc kubenswrapper[4618]: I0121 09:17:52.981636 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmx97" event={"ID":"af9e4105-9c76-46df-a046-76f2efb55036","Type":"ContainerDied","Data":"29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2"} Jan 21 09:17:53 crc kubenswrapper[4618]: I0121 09:17:53.556476 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" path="/var/lib/kubelet/pods/b1bfd2f4-8562-470f-b4ec-ac051f8565d1/volumes" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.347949 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.351272 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w578p" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.520379 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fkjp\" (UniqueName: \"kubernetes.io/projected/9359c136-cac8-46b2-b341-b893187d9476-kube-api-access-2fkjp\") pod \"9359c136-cac8-46b2-b341-b893187d9476\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.520500 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37791f6-6d08-4964-9ac8-c9d99f04979c-operator-scripts\") pod \"a37791f6-6d08-4964-9ac8-c9d99f04979c\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.520528 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9359c136-cac8-46b2-b341-b893187d9476-operator-scripts\") pod \"9359c136-cac8-46b2-b341-b893187d9476\" (UID: \"9359c136-cac8-46b2-b341-b893187d9476\") " Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.520616 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8vhh\" (UniqueName: \"kubernetes.io/projected/a37791f6-6d08-4964-9ac8-c9d99f04979c-kube-api-access-q8vhh\") pod \"a37791f6-6d08-4964-9ac8-c9d99f04979c\" (UID: \"a37791f6-6d08-4964-9ac8-c9d99f04979c\") " Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.521243 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37791f6-6d08-4964-9ac8-c9d99f04979c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a37791f6-6d08-4964-9ac8-c9d99f04979c" (UID: "a37791f6-6d08-4964-9ac8-c9d99f04979c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.521339 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9359c136-cac8-46b2-b341-b893187d9476-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9359c136-cac8-46b2-b341-b893187d9476" (UID: "9359c136-cac8-46b2-b341-b893187d9476"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.524501 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37791f6-6d08-4964-9ac8-c9d99f04979c-kube-api-access-q8vhh" (OuterVolumeSpecName: "kube-api-access-q8vhh") pod "a37791f6-6d08-4964-9ac8-c9d99f04979c" (UID: "a37791f6-6d08-4964-9ac8-c9d99f04979c"). InnerVolumeSpecName "kube-api-access-q8vhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.524979 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9359c136-cac8-46b2-b341-b893187d9476-kube-api-access-2fkjp" (OuterVolumeSpecName: "kube-api-access-2fkjp") pod "9359c136-cac8-46b2-b341-b893187d9476" (UID: "9359c136-cac8-46b2-b341-b893187d9476"). InnerVolumeSpecName "kube-api-access-2fkjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.622356 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37791f6-6d08-4964-9ac8-c9d99f04979c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.622386 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9359c136-cac8-46b2-b341-b893187d9476-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.622397 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8vhh\" (UniqueName: \"kubernetes.io/projected/a37791f6-6d08-4964-9ac8-c9d99f04979c-kube-api-access-q8vhh\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:56 crc kubenswrapper[4618]: I0121 09:17:56.622411 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fkjp\" (UniqueName: \"kubernetes.io/projected/9359c136-cac8-46b2-b341-b893187d9476-kube-api-access-2fkjp\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.014832 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w578p" event={"ID":"a37791f6-6d08-4964-9ac8-c9d99f04979c","Type":"ContainerDied","Data":"bfc725b5516191b4eddeba5edd293760d286c6f9707ac28067b77d3489ef1d55"} Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.015107 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc725b5516191b4eddeba5edd293760d286c6f9707ac28067b77d3489ef1d55" Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.015169 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w578p" Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.021711 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-00df-account-create-update-d7dj6" event={"ID":"9359c136-cac8-46b2-b341-b893187d9476","Type":"ContainerDied","Data":"8b65c602e5a821f494aeb36fe04a94932925fabe28b4c94f64c6d91317cdaeb0"} Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.021756 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b65c602e5a821f494aeb36fe04a94932925fabe28b4c94f64c6d91317cdaeb0" Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.021721 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-00df-account-create-update-d7dj6" Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.023662 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mdmw7" event={"ID":"e651bca2-bb3f-4946-a656-8900b6c25427","Type":"ContainerStarted","Data":"8b76966ef840de994e421729c0767fcf908c1cf8cd0b678fdea4b8d8971b5fb4"} Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.026195 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmx97" event={"ID":"af9e4105-9c76-46df-a046-76f2efb55036","Type":"ContainerStarted","Data":"be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9"} Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.040015 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mdmw7" podStartSLOduration=2.262791391 podStartE2EDuration="8.040005322s" podCreationTimestamp="2026-01-21 09:17:49 +0000 UTC" firstStartedPulling="2026-01-21 09:17:50.472661522 +0000 UTC m=+869.223128839" lastFinishedPulling="2026-01-21 09:17:56.249875453 +0000 UTC m=+875.000342770" observedRunningTime="2026-01-21 09:17:57.039440481 +0000 UTC m=+875.789907797" 
watchObservedRunningTime="2026-01-21 09:17:57.040005322 +0000 UTC m=+875.790472640" Jan 21 09:17:57 crc kubenswrapper[4618]: I0121 09:17:57.072579 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vmx97" podStartSLOduration=2.321820645 podStartE2EDuration="7.072565797s" podCreationTimestamp="2026-01-21 09:17:50 +0000 UTC" firstStartedPulling="2026-01-21 09:17:51.953823555 +0000 UTC m=+870.704290872" lastFinishedPulling="2026-01-21 09:17:56.704568708 +0000 UTC m=+875.455036024" observedRunningTime="2026-01-21 09:17:57.069107543 +0000 UTC m=+875.819574860" watchObservedRunningTime="2026-01-21 09:17:57.072565797 +0000 UTC m=+875.823033115" Jan 21 09:17:58 crc kubenswrapper[4618]: I0121 09:17:58.034828 4618 generic.go:334] "Generic (PLEG): container finished" podID="e651bca2-bb3f-4946-a656-8900b6c25427" containerID="8b76966ef840de994e421729c0767fcf908c1cf8cd0b678fdea4b8d8971b5fb4" exitCode=0 Jan 21 09:17:58 crc kubenswrapper[4618]: I0121 09:17:58.034925 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mdmw7" event={"ID":"e651bca2-bb3f-4946-a656-8900b6c25427","Type":"ContainerDied","Data":"8b76966ef840de994e421729c0767fcf908c1cf8cd0b678fdea4b8d8971b5fb4"} Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.313043 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.461792 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-config-data\") pod \"e651bca2-bb3f-4946-a656-8900b6c25427\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.461826 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xccvl\" (UniqueName: \"kubernetes.io/projected/e651bca2-bb3f-4946-a656-8900b6c25427-kube-api-access-xccvl\") pod \"e651bca2-bb3f-4946-a656-8900b6c25427\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.461884 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-combined-ca-bundle\") pod \"e651bca2-bb3f-4946-a656-8900b6c25427\" (UID: \"e651bca2-bb3f-4946-a656-8900b6c25427\") " Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.469664 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e651bca2-bb3f-4946-a656-8900b6c25427-kube-api-access-xccvl" (OuterVolumeSpecName: "kube-api-access-xccvl") pod "e651bca2-bb3f-4946-a656-8900b6c25427" (UID: "e651bca2-bb3f-4946-a656-8900b6c25427"). InnerVolumeSpecName "kube-api-access-xccvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.489289 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e651bca2-bb3f-4946-a656-8900b6c25427" (UID: "e651bca2-bb3f-4946-a656-8900b6c25427"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.498903 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-config-data" (OuterVolumeSpecName: "config-data") pod "e651bca2-bb3f-4946-a656-8900b6c25427" (UID: "e651bca2-bb3f-4946-a656-8900b6c25427"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.563905 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.563927 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e651bca2-bb3f-4946-a656-8900b6c25427-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:59 crc kubenswrapper[4618]: I0121 09:17:59.563938 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xccvl\" (UniqueName: \"kubernetes.io/projected/e651bca2-bb3f-4946-a656-8900b6c25427-kube-api-access-xccvl\") on node \"crc\" DevicePath \"\"" Jan 21 09:17:59 crc kubenswrapper[4618]: E0121 09:17:59.897743 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76cf94b1_8904_4389_8ef3_8dd36ea02ecf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice\": RecentStats: unable to find data in memory cache]" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.050078 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mdmw7" 
event={"ID":"e651bca2-bb3f-4946-a656-8900b6c25427","Type":"ContainerDied","Data":"b72c3c7a36d00eea852ba616862dd46700300b7af59229ee8fd96d7c87ccc47c"} Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.050118 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b72c3c7a36d00eea852ba616862dd46700300b7af59229ee8fd96d7c87ccc47c" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.050184 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mdmw7" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.271715 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2qh7g"] Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272047 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ff954b-927c-4066-b75d-0e0f4530f888" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272064 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ff954b-927c-4066-b75d-0e0f4530f888" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272083 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37791f6-6d08-4964-9ac8-c9d99f04979c" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272089 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37791f6-6d08-4964-9ac8-c9d99f04979c" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272096 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerName="dnsmasq-dns" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272101 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerName="dnsmasq-dns" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272110 4618 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2835ff70-c16c-42dd-8971-f3cfd0ae800f" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272116 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2835ff70-c16c-42dd-8971-f3cfd0ae800f" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272128 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cd0644-db42-42d1-884b-8c341d4eb1c9" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272136 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cd0644-db42-42d1-884b-8c341d4eb1c9" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272165 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9359c136-cac8-46b2-b341-b893187d9476" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272171 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9359c136-cac8-46b2-b341-b893187d9476" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272180 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e120b8a-c3f7-41ec-8987-a8b74198bc74" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272185 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e120b8a-c3f7-41ec-8987-a8b74198bc74" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272191 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerName="init" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272196 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerName="init" Jan 
21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.272205 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e651bca2-bb3f-4946-a656-8900b6c25427" containerName="keystone-db-sync" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272210 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e651bca2-bb3f-4946-a656-8900b6c25427" containerName="keystone-db-sync" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272353 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e120b8a-c3f7-41ec-8987-a8b74198bc74" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272364 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3cd0644-db42-42d1-884b-8c341d4eb1c9" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272371 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ff954b-927c-4066-b75d-0e0f4530f888" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272380 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2835ff70-c16c-42dd-8971-f3cfd0ae800f" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272386 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1bfd2f4-8562-470f-b4ec-ac051f8565d1" containerName="dnsmasq-dns" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272395 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e651bca2-bb3f-4946-a656-8900b6c25427" containerName="keystone-db-sync" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272404 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9359c136-cac8-46b2-b341-b893187d9476" containerName="mariadb-account-create-update" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272410 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a37791f6-6d08-4964-9ac8-c9d99f04979c" containerName="mariadb-database-create" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.272872 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.274047 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-combined-ca-bundle\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.274091 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-credential-keys\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.274270 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-config-data\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.274319 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-fernet-keys\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.274510 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dh2bd\" (UniqueName: \"kubernetes.io/projected/b7894c60-e40c-4cbc-8938-660a8c6791b9-kube-api-access-dh2bd\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.274555 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-scripts\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.276597 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2qh7g"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.282566 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.282686 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l8whb" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.282730 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.282790 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.282855 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.283429 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-qknbg"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.284486 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.297906 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-qknbg"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380556 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-config-data\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380602 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-fernet-keys\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380637 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380711 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380734 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380758 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2bd\" (UniqueName: \"kubernetes.io/projected/b7894c60-e40c-4cbc-8938-660a8c6791b9-kube-api-access-dh2bd\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380783 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-scripts\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380818 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380848 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbm5r\" (UniqueName: \"kubernetes.io/projected/3bd55665-cf2b-4d2a-8982-03a32da87e7f-kube-api-access-mbm5r\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380878 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-combined-ca-bundle\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380898 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-credential-keys\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.380942 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-config\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.400818 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-config-data\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.406036 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-combined-ca-bundle\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.408573 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-scripts\") pod \"keystone-bootstrap-2qh7g\" (UID: 
\"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.409886 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-fernet-keys\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.410533 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-credential-keys\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.413941 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2bd\" (UniqueName: \"kubernetes.io/projected/b7894c60-e40c-4cbc-8938-660a8c6791b9-kube-api-access-dh2bd\") pod \"keystone-bootstrap-2qh7g\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.457056 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9f546b547-ct97b"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.458199 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: W0121 09:18:00.460249 4618 reflector.go:561] object-"openstack"/"horizon-horizon-dockercfg-tmhfw": failed to list *v1.Secret: secrets "horizon-horizon-dockercfg-tmhfw" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Jan 21 09:18:00 crc kubenswrapper[4618]: E0121 09:18:00.460287 4618 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"horizon-horizon-dockercfg-tmhfw\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"horizon-horizon-dockercfg-tmhfw\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.461618 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.461764 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.462903 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.481681 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/048b9318-e305-40c3-86e9-9081b01ca1cb-logs\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.481952 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-config\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.481988 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csv6b\" (UniqueName: \"kubernetes.io/projected/048b9318-e305-40c3-86e9-9081b01ca1cb-kube-api-access-csv6b\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482027 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482046 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-config-data\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482111 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-scripts\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482127 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/048b9318-e305-40c3-86e9-9081b01ca1cb-horizon-secret-key\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482157 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482171 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482211 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.482236 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbm5r\" (UniqueName: \"kubernetes.io/projected/3bd55665-cf2b-4d2a-8982-03a32da87e7f-kube-api-access-mbm5r\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.483176 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-config\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.483687 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.484173 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.484658 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.485131 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.497973 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f546b547-ct97b"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.508508 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-sync-vs7mz"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.509382 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.514718 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vdpf6" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.514995 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.532911 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbm5r\" (UniqueName: \"kubernetes.io/projected/3bd55665-cf2b-4d2a-8982-03a32da87e7f-kube-api-access-mbm5r\") pod \"dnsmasq-dns-5fdbfbc95f-qknbg\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.547489 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.547757 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vs7mz"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586116 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csv6b\" (UniqueName: \"kubernetes.io/projected/048b9318-e305-40c3-86e9-9081b01ca1cb-kube-api-access-csv6b\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586175 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-config-data\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") 
" pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586205 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-combined-ca-bundle\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586235 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-config-data\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586282 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-scripts\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586304 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/048b9318-e305-40c3-86e9-9081b01ca1cb-horizon-secret-key\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586340 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-db-sync-config-data\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: 
I0121 09:18:00.586371 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-etc-machine-id\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586409 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-scripts\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586440 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmc9d\" (UniqueName: \"kubernetes.io/projected/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-kube-api-access-zmc9d\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586487 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/048b9318-e305-40c3-86e9-9081b01ca1cb-logs\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.586856 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/048b9318-e305-40c3-86e9-9081b01ca1cb-logs\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.587797 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-scripts\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.588128 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-config-data\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.590734 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.601694 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/048b9318-e305-40c3-86e9-9081b01ca1cb-horizon-secret-key\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.611817 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.618187 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csv6b\" (UniqueName: \"kubernetes.io/projected/048b9318-e305-40c3-86e9-9081b01ca1cb-kube-api-access-csv6b\") pod \"horizon-9f546b547-ct97b\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.645868 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wgbvc"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.656199 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.662264 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2fbfh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.662546 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.674497 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.676096 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.679077 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wgbvc"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.679730 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vstgl" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.680458 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.680589 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.680769 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688338 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-combined-ca-bundle\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " 
pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688382 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8r9c\" (UniqueName: \"kubernetes.io/projected/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-kube-api-access-z8r9c\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688467 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-db-sync-config-data\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688495 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-db-sync-config-data\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688524 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-etc-machine-id\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688550 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-scripts\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc 
kubenswrapper[4618]: I0121 09:18:00.688566 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmc9d\" (UniqueName: \"kubernetes.io/projected/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-kube-api-access-zmc9d\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688623 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-config-data\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.688647 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-combined-ca-bundle\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.693778 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-combined-ca-bundle\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.693839 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-etc-machine-id\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.700847 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.711933 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-config-data\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.713569 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-db-sync-config-data\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.716895 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-scripts\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.739097 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-856d56fb7f-qjx5g"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.740716 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.751887 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmc9d\" (UniqueName: \"kubernetes.io/projected/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-kube-api-access-zmc9d\") pod \"cinder-db-sync-vs7mz\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.756060 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-qknbg"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.782662 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-856d56fb7f-qjx5g"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.790755 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-combined-ca-bundle\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.790817 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8r9c\" (UniqueName: \"kubernetes.io/projected/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-kube-api-access-z8r9c\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.792604 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 
09:18:00.792698 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.792748 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.792770 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.792827 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v94z6\" (UniqueName: \"kubernetes.io/projected/7c091558-e8d7-4113-85b5-ff0abc314545-kube-api-access-v94z6\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.792892 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-db-sync-config-data\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: 
I0121 09:18:00.792938 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.794124 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.794401 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-logs\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.794499 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-47fmh"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.796063 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.801082 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-combined-ca-bundle\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.802125 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-db-sync-config-data\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.805221 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jqn8x"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.806063 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.808120 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.808294 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.813077 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.814451 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8r9c\" (UniqueName: \"kubernetes.io/projected/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-kube-api-access-z8r9c\") pod \"barbican-db-sync-wgbvc\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.814693 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.814974 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q5dj5" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.816382 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.816599 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.820889 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-q2tdw"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.822935 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.824644 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.824796 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-j9mls" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.824881 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.827212 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-47fmh"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.832767 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jqn8x"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.836943 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.837920 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.844936 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q2tdw"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.852477 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.859897 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.875016 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.879207 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.884775 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.885916 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.885930 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.886513 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.896383 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba25c1a2-83b3-4f25-bc8a-030749218a5e-horizon-secret-key\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.896438 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.896469 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-logs\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.896918 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-logs\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.896975 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-config-data\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897011 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897054 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897084 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897102 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897133 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v94z6\" (UniqueName: \"kubernetes.io/projected/7c091558-e8d7-4113-85b5-ff0abc314545-kube-api-access-v94z6\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897180 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-scripts\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897216 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25c1a2-83b3-4f25-bc8a-030749218a5e-logs\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897247 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897268 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjb9q\" (UniqueName: \"kubernetes.io/projected/ba25c1a2-83b3-4f25-bc8a-030749218a5e-kube-api-access-zjb9q\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897574 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.897804 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.908946 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.916207 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.916782 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v94z6\" (UniqueName: \"kubernetes.io/projected/7c091558-e8d7-4113-85b5-ff0abc314545-kube-api-access-v94z6\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.919685 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.929098 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.929984 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.998981 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba25c1a2-83b3-4f25-bc8a-030749218a5e-horizon-secret-key\") 
pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999358 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999409 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999439 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-config-data\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999461 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999504 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkd6\" (UniqueName: \"kubernetes.io/projected/528c0fac-d760-4ab1-8e8f-3edf42c51f40-kube-api-access-8kkd6\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: 
\"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999523 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-scripts\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999552 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nz4\" (UniqueName: \"kubernetes.io/projected/1f0f8544-4e4e-49e4-8eff-43529c9e607b-kube-api-access-48nz4\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999580 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12353eaa-fb43-415a-b590-f69fadbdd4e1-logs\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999603 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999624 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-config-data\") pod \"horizon-856d56fb7f-qjx5g\" (UID: 
\"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999650 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999670 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999686 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999705 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69gxl\" (UniqueName: \"kubernetes.io/projected/2eb33687-2287-42d0-aa55-a0aadf31dcca-kube-api-access-69gxl\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999721 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" 
(UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999738 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999758 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999776 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9l2\" (UniqueName: \"kubernetes.io/projected/12353eaa-fb43-415a-b590-f69fadbdd4e1-kube-api-access-cd9l2\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999801 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwldc\" (UniqueName: \"kubernetes.io/projected/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-kube-api-access-dwldc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999843 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999856 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-scripts\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999877 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-scripts\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999894 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-combined-ca-bundle\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999916 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-config\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999934 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25c1a2-83b3-4f25-bc8a-030749218a5e-logs\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 
21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999952 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-config\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999970 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:00 crc kubenswrapper[4618]: I0121 09:18:00.999995 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjb9q\" (UniqueName: \"kubernetes.io/projected/ba25c1a2-83b3-4f25-bc8a-030749218a5e-kube-api-access-zjb9q\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.000025 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.000040 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-run-httpd\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.000061 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-log-httpd\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.000109 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-combined-ca-bundle\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.000126 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-config-data\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.002260 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-config-data\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.003616 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-scripts\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.003854 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ba25c1a2-83b3-4f25-bc8a-030749218a5e-logs\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.005728 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba25c1a2-83b3-4f25-bc8a-030749218a5e-horizon-secret-key\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.017257 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjb9q\" (UniqueName: \"kubernetes.io/projected/ba25c1a2-83b3-4f25-bc8a-030749218a5e-kube-api-access-zjb9q\") pod \"horizon-856d56fb7f-qjx5g\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.022292 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.048498 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.095329 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.102532 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.102617 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-run-httpd\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.102687 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-log-httpd\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.102784 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-combined-ca-bundle\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.102814 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-config-data\") pod \"placement-db-sync-q2tdw\" (UID: 
\"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.104359 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.104848 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.104895 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-config-data\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.104935 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.104990 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkd6\" (UniqueName: \"kubernetes.io/projected/528c0fac-d760-4ab1-8e8f-3edf42c51f40-kube-api-access-8kkd6\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 
09:18:01.105024 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-scripts\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105090 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48nz4\" (UniqueName: \"kubernetes.io/projected/1f0f8544-4e4e-49e4-8eff-43529c9e607b-kube-api-access-48nz4\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105133 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12353eaa-fb43-415a-b590-f69fadbdd4e1-logs\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105180 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105213 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105247 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105269 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105326 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69gxl\" (UniqueName: \"kubernetes.io/projected/2eb33687-2287-42d0-aa55-a0aadf31dcca-kube-api-access-69gxl\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105359 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105384 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105419 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9l2\" (UniqueName: \"kubernetes.io/projected/12353eaa-fb43-415a-b590-f69fadbdd4e1-kube-api-access-cd9l2\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105477 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-log-httpd\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105518 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwldc\" (UniqueName: \"kubernetes.io/projected/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-kube-api-access-dwldc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105615 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105635 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-scripts\") pod \"ceilometer-0\" (UID: 
\"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105694 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-combined-ca-bundle\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105733 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105743 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-config\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105820 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-config\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.120258 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.121779 
4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.122050 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.123091 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-config-data\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.123785 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-config-data\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.123815 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105787 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-run-httpd\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.123882 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.124027 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-scripts\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105318 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.124470 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12353eaa-fb43-415a-b590-f69fadbdd4e1-logs\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.124499 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-config\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " 
pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.105864 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.125735 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.126511 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-combined-ca-bundle\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.127084 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.127610 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.127919 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.129975 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.134868 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nz4\" (UniqueName: \"kubernetes.io/projected/1f0f8544-4e4e-49e4-8eff-43529c9e607b-kube-api-access-48nz4\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.149857 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkd6\" (UniqueName: \"kubernetes.io/projected/528c0fac-d760-4ab1-8e8f-3edf42c51f40-kube-api-access-8kkd6\") pod \"dnsmasq-dns-6f6f8cb849-47fmh\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.150439 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-config\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.152000 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.152612 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-combined-ca-bundle\") pod \"neutron-db-sync-jqn8x\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.156450 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.157310 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwldc\" (UniqueName: \"kubernetes.io/projected/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-kube-api-access-dwldc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.157745 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69gxl\" (UniqueName: \"kubernetes.io/projected/2eb33687-2287-42d0-aa55-a0aadf31dcca-kube-api-access-69gxl\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.162395 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-scripts\") pod \"ceilometer-0\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.163013 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd9l2\" 
(UniqueName: \"kubernetes.io/projected/12353eaa-fb43-415a-b590-f69fadbdd4e1-kube-api-access-cd9l2\") pod \"placement-db-sync-q2tdw\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.167879 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.182361 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.186219 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.189482 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.190781 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmx97"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.196398 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-qknbg"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.216189 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.310723 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2qh7g"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.411404 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vs7mz"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.560078 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wgbvc"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.646929 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:01 crc kubenswrapper[4618]: W0121 09:18:01.665218 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c091558_e8d7_4113_85b5_ff0abc314545.slice/crio-678eb37fd87c66ecae07531bc945223221b538c4e104ccc75a2187d6aa436e45 WatchSource:0}: Error finding container 678eb37fd87c66ecae07531bc945223221b538c4e104ccc75a2187d6aa436e45: Status 404 returned error can't find the container with id 678eb37fd87c66ecae07531bc945223221b538c4e104ccc75a2187d6aa436e45 Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.796479 4618 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/horizon-9f546b547-ct97b" secret="" err="failed to sync secret cache: timed out waiting for the condition" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.796549 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.804648 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jqn8x"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.925884 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.935347 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q2tdw"] Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.943080 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-47fmh"] Jan 21 09:18:01 crc kubenswrapper[4618]: W0121 09:18:01.950270 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eb33687_2287_42d0_aa55_a0aadf31dcca.slice/crio-f4ad97bdfcd096624bc1d7458788c98fd64faaf813b5cef598b4075afbcf1f35 WatchSource:0}: Error finding container f4ad97bdfcd096624bc1d7458788c98fd64faaf813b5cef598b4075afbcf1f35: Status 404 returned error can't find the container with id f4ad97bdfcd096624bc1d7458788c98fd64faaf813b5cef598b4075afbcf1f35 Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.951899 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tmhfw" Jan 21 09:18:01 crc kubenswrapper[4618]: I0121 09:18:01.951938 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.098600 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs7mz" event={"ID":"46a5bcfd-6e6c-4070-b7e2-b2e90789f888","Type":"ContainerStarted","Data":"9b29f209a01559e4bc231bf58e6f8c1bf96701266aa69313731d4e58c3739e5b"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.108309 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" event={"ID":"528c0fac-d760-4ab1-8e8f-3edf42c51f40","Type":"ContainerStarted","Data":"096082e3e75397c88b83c7b74ea0d61f6c87c32d78525e4f464515c6e27806ae"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.113058 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" event={"ID":"3bd55665-cf2b-4d2a-8982-03a32da87e7f","Type":"ContainerStarted","Data":"6f3db65f929d838ff57e5a45389af81405550653745ba1f42fd3d7e4740c9675"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.116089 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2qh7g" event={"ID":"b7894c60-e40c-4cbc-8938-660a8c6791b9","Type":"ContainerStarted","Data":"df2f207300ffca94a03031cc7e319bae123ac2fec933336d759db8f7776987a8"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.137083 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jqn8x" event={"ID":"1f0f8544-4e4e-49e4-8eff-43529c9e607b","Type":"ContainerStarted","Data":"42c20c2dcb0f642e8a7fef5cf72d0c6432a831353271fe84bb83b67bee33790f"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.150872 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgbvc" event={"ID":"1f724147-bec3-4df8-8f5d-cb9ff9e128e0","Type":"ContainerStarted","Data":"2545458992bdcdf39c642793c8085c5a06933294a717e0e941e3b87445383567"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.163998 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerStarted","Data":"f4ad97bdfcd096624bc1d7458788c98fd64faaf813b5cef598b4075afbcf1f35"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.167769 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2tdw" event={"ID":"12353eaa-fb43-415a-b590-f69fadbdd4e1","Type":"ContainerStarted","Data":"3ef9577bc38421c4b291dded0ea127ca5a4f40bdc5c0cf28ea527101511b7cc6"} Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.169443 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2qh7g" podStartSLOduration=2.169429945 podStartE2EDuration="2.169429945s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:02.156755428 +0000 UTC m=+880.907222745" watchObservedRunningTime="2026-01-21 09:18:02.169429945 +0000 UTC m=+880.919897261" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.178736 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c091558-e8d7-4113-85b5-ff0abc314545","Type":"ContainerStarted","Data":"678eb37fd87c66ecae07531bc945223221b538c4e104ccc75a2187d6aa436e45"} Jan 21 09:18:02 crc kubenswrapper[4618]: W0121 09:18:02.195590 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f4975ba_c37d_42cb_af29_49e7d5e4e2b6.slice/crio-915d286c7e421f6424c68ba943d52bdbf756b5ed403be644da0a482285732f70 WatchSource:0}: Error finding container 915d286c7e421f6424c68ba943d52bdbf756b5ed403be644da0a482285732f70: Status 404 returned error can't find the container with id 915d286c7e421f6424c68ba943d52bdbf756b5ed403be644da0a482285732f70 Jan 21 09:18:02 crc 
kubenswrapper[4618]: I0121 09:18:02.197738 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.225473 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jqn8x" podStartSLOduration=2.225450068 podStartE2EDuration="2.225450068s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:02.178420091 +0000 UTC m=+880.928887408" watchObservedRunningTime="2026-01-21 09:18:02.225450068 +0000 UTC m=+880.975917385" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.256492 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f546b547-ct97b"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.445827 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-856d56fb7f-qjx5g"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.557523 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.559335 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.586405 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-856d56fb7f-qjx5g"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.661046 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.676972 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-svc\") pod \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.677046 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-swift-storage-0\") pod \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.677215 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-config\") pod \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.677279 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-sb\") pod \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.677355 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbm5r\" (UniqueName: 
\"kubernetes.io/projected/3bd55665-cf2b-4d2a-8982-03a32da87e7f-kube-api-access-mbm5r\") pod \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.677413 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-nb\") pod \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\" (UID: \"3bd55665-cf2b-4d2a-8982-03a32da87e7f\") " Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.689200 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-575d89d597-bq5bt"] Jan 21 09:18:02 crc kubenswrapper[4618]: E0121 09:18:02.689740 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd55665-cf2b-4d2a-8982-03a32da87e7f" containerName="init" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.689756 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd55665-cf2b-4d2a-8982-03a32da87e7f" containerName="init" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.689998 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd55665-cf2b-4d2a-8982-03a32da87e7f" containerName="init" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.691099 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.727188 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575d89d597-bq5bt"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.762868 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd55665-cf2b-4d2a-8982-03a32da87e7f-kube-api-access-mbm5r" (OuterVolumeSpecName: "kube-api-access-mbm5r") pod "3bd55665-cf2b-4d2a-8982-03a32da87e7f" (UID: "3bd55665-cf2b-4d2a-8982-03a32da87e7f"). 
InnerVolumeSpecName "kube-api-access-mbm5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.784712 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf76d0-4280-433e-8002-7a61d78b6f11-logs\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.784775 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-scripts\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.784855 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-config-data\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.784886 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p86x8\" (UniqueName: \"kubernetes.io/projected/addf76d0-4280-433e-8002-7a61d78b6f11-kube-api-access-p86x8\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.784924 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/addf76d0-4280-433e-8002-7a61d78b6f11-horizon-secret-key\") pod \"horizon-575d89d597-bq5bt\" (UID: 
\"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.784969 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbm5r\" (UniqueName: \"kubernetes.io/projected/3bd55665-cf2b-4d2a-8982-03a32da87e7f-kube-api-access-mbm5r\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.805077 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3bd55665-cf2b-4d2a-8982-03a32da87e7f" (UID: "3bd55665-cf2b-4d2a-8982-03a32da87e7f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.805562 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bd55665-cf2b-4d2a-8982-03a32da87e7f" (UID: "3bd55665-cf2b-4d2a-8982-03a32da87e7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.830671 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bd55665-cf2b-4d2a-8982-03a32da87e7f" (UID: "3bd55665-cf2b-4d2a-8982-03a32da87e7f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.836826 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bd55665-cf2b-4d2a-8982-03a32da87e7f" (UID: "3bd55665-cf2b-4d2a-8982-03a32da87e7f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.852556 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.872806 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-config" (OuterVolumeSpecName: "config") pod "3bd55665-cf2b-4d2a-8982-03a32da87e7f" (UID: "3bd55665-cf2b-4d2a-8982-03a32da87e7f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895272 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf76d0-4280-433e-8002-7a61d78b6f11-logs\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895334 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-scripts\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895511 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-config-data\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895569 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p86x8\" (UniqueName: \"kubernetes.io/projected/addf76d0-4280-433e-8002-7a61d78b6f11-kube-api-access-p86x8\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895642 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/addf76d0-4280-433e-8002-7a61d78b6f11-horizon-secret-key\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 
09:18:02.895706 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895719 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895728 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895736 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.895744 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd55665-cf2b-4d2a-8982-03a32da87e7f-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.896396 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf76d0-4280-433e-8002-7a61d78b6f11-logs\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.897302 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-scripts\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 
09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.897526 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-config-data\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.903555 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/addf76d0-4280-433e-8002-7a61d78b6f11-horizon-secret-key\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:02 crc kubenswrapper[4618]: I0121 09:18:02.913450 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p86x8\" (UniqueName: \"kubernetes.io/projected/addf76d0-4280-433e-8002-7a61d78b6f11-kube-api-access-p86x8\") pod \"horizon-575d89d597-bq5bt\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.132197 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.209029 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-856d56fb7f-qjx5g" event={"ID":"ba25c1a2-83b3-4f25-bc8a-030749218a5e","Type":"ContainerStarted","Data":"9c45f19f2f72d9ae942eff8c499bc022d87699078a9eab7acef35631605b7209"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.211045 4618 generic.go:334] "Generic (PLEG): container finished" podID="3bd55665-cf2b-4d2a-8982-03a32da87e7f" containerID="3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac" exitCode=0 Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.211117 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" event={"ID":"3bd55665-cf2b-4d2a-8982-03a32da87e7f","Type":"ContainerDied","Data":"3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.211193 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" event={"ID":"3bd55665-cf2b-4d2a-8982-03a32da87e7f","Type":"ContainerDied","Data":"6f3db65f929d838ff57e5a45389af81405550653745ba1f42fd3d7e4740c9675"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.211212 4618 scope.go:117] "RemoveContainer" containerID="3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.211221 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-qknbg" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.214281 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2qh7g" event={"ID":"b7894c60-e40c-4cbc-8938-660a8c6791b9","Type":"ContainerStarted","Data":"a1b897c6d1a799b4df6a433b824601689bfbad2a1b7ee1e32838121e14b0cb39"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.221025 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6","Type":"ContainerStarted","Data":"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.221052 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6","Type":"ContainerStarted","Data":"915d286c7e421f6424c68ba943d52bdbf756b5ed403be644da0a482285732f70"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.222744 4618 generic.go:334] "Generic (PLEG): container finished" podID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerID="27bff08c38c1a8d2f4236396f25924479e28878fb87b359584fce30c74834ea8" exitCode=0 Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.222780 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" event={"ID":"528c0fac-d760-4ab1-8e8f-3edf42c51f40","Type":"ContainerDied","Data":"27bff08c38c1a8d2f4236396f25924479e28878fb87b359584fce30c74834ea8"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.228483 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f546b547-ct97b" event={"ID":"048b9318-e305-40c3-86e9-9081b01ca1cb","Type":"ContainerStarted","Data":"270c42f144481e2add4117f12cf94d8d3161bad95d093f828794b1d1a23a54a8"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.230134 4618 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c091558-e8d7-4113-85b5-ff0abc314545","Type":"ContainerStarted","Data":"90c8266a3bd9931b8c12d03c10bcf62281598d91509d2ec778af9dcbfc911b92"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.232489 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vmx97" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="registry-server" containerID="cri-o://be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9" gracePeriod=2 Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.232897 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jqn8x" event={"ID":"1f0f8544-4e4e-49e4-8eff-43529c9e607b","Type":"ContainerStarted","Data":"42ab8ca3f7c184a911cc0c2f7df404d8872facf10b1646eb88069c91f79c5f48"} Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.294122 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-qknbg"] Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.298514 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-qknbg"] Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.418221 4618 scope.go:117] "RemoveContainer" containerID="3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac" Jan 21 09:18:03 crc kubenswrapper[4618]: E0121 09:18:03.429534 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac\": container with ID starting with 3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac not found: ID does not exist" containerID="3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.429588 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac"} err="failed to get container status \"3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac\": rpc error: code = NotFound desc = could not find container \"3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac\": container with ID starting with 3c4671d62d8e7d14c8b58bedbc2bcc988a95c5e04aa412730d6681794cba12ac not found: ID does not exist" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.583329 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd55665-cf2b-4d2a-8982-03a32da87e7f" path="/var/lib/kubelet/pods/3bd55665-cf2b-4d2a-8982-03a32da87e7f/volumes" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.737480 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575d89d597-bq5bt"] Jan 21 09:18:03 crc kubenswrapper[4618]: W0121 09:18:03.794271 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaddf76d0_4280_433e_8002_7a61d78b6f11.slice/crio-84d236deeff9c94d9425cf8c8ad70d3398c2fd4a391a138029bbd476f6340bcc WatchSource:0}: Error finding container 84d236deeff9c94d9425cf8c8ad70d3398c2fd4a391a138029bbd476f6340bcc: Status 404 returned error can't find the container with id 84d236deeff9c94d9425cf8c8ad70d3398c2fd4a391a138029bbd476f6340bcc Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.828831 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.924960 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-catalog-content\") pod \"af9e4105-9c76-46df-a046-76f2efb55036\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.925297 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-utilities\") pod \"af9e4105-9c76-46df-a046-76f2efb55036\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.925328 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdc5\" (UniqueName: \"kubernetes.io/projected/af9e4105-9c76-46df-a046-76f2efb55036-kube-api-access-6wdc5\") pod \"af9e4105-9c76-46df-a046-76f2efb55036\" (UID: \"af9e4105-9c76-46df-a046-76f2efb55036\") " Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.926202 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-utilities" (OuterVolumeSpecName: "utilities") pod "af9e4105-9c76-46df-a046-76f2efb55036" (UID: "af9e4105-9c76-46df-a046-76f2efb55036"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.931351 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9e4105-9c76-46df-a046-76f2efb55036-kube-api-access-6wdc5" (OuterVolumeSpecName: "kube-api-access-6wdc5") pod "af9e4105-9c76-46df-a046-76f2efb55036" (UID: "af9e4105-9c76-46df-a046-76f2efb55036"). InnerVolumeSpecName "kube-api-access-6wdc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:03 crc kubenswrapper[4618]: I0121 09:18:03.954333 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af9e4105-9c76-46df-a046-76f2efb55036" (UID: "af9e4105-9c76-46df-a046-76f2efb55036"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.027673 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.027710 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9e4105-9c76-46df-a046-76f2efb55036-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.027723 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdc5\" (UniqueName: \"kubernetes.io/projected/af9e4105-9c76-46df-a046-76f2efb55036-kube-api-access-6wdc5\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.258587 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575d89d597-bq5bt" event={"ID":"addf76d0-4280-433e-8002-7a61d78b6f11","Type":"ContainerStarted","Data":"84d236deeff9c94d9425cf8c8ad70d3398c2fd4a391a138029bbd476f6340bcc"} Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.260012 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6","Type":"ContainerStarted","Data":"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296"} Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 
09:18:04.260192 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-log" containerID="cri-o://b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6" gracePeriod=30 Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.260255 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-httpd" containerID="cri-o://f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296" gracePeriod=30 Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.264849 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" event={"ID":"528c0fac-d760-4ab1-8e8f-3edf42c51f40","Type":"ContainerStarted","Data":"1cfea8e5527ff192e098e92bd19599ab93b794be05d85421784cf1dcdaf59376"} Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.265400 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.269448 4618 generic.go:334] "Generic (PLEG): container finished" podID="af9e4105-9c76-46df-a046-76f2efb55036" containerID="be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9" exitCode=0 Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.269518 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmx97" event={"ID":"af9e4105-9c76-46df-a046-76f2efb55036","Type":"ContainerDied","Data":"be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9"} Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.269547 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmx97" 
event={"ID":"af9e4105-9c76-46df-a046-76f2efb55036","Type":"ContainerDied","Data":"cb4025302e64ad7f7563d3da97382f84c31ef7f961bbdec9fb547a6bb4ecb3e4"} Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.269567 4618 scope.go:117] "RemoveContainer" containerID="be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.269734 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmx97" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.272822 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-log" containerID="cri-o://90c8266a3bd9931b8c12d03c10bcf62281598d91509d2ec778af9dcbfc911b92" gracePeriod=30 Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.273060 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c091558-e8d7-4113-85b5-ff0abc314545","Type":"ContainerStarted","Data":"5a50dc9770b62da31110e66afd6eb08ea3e2ecd712f9ca2f6104e92320dee042"} Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.273537 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-httpd" containerID="cri-o://5a50dc9770b62da31110e66afd6eb08ea3e2ecd712f9ca2f6104e92320dee042" gracePeriod=30 Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.303368 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.303348126 podStartE2EDuration="4.303348126s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
09:18:04.276499434 +0000 UTC m=+883.026966751" watchObservedRunningTime="2026-01-21 09:18:04.303348126 +0000 UTC m=+883.053815444" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.320211 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.3201798799999995 podStartE2EDuration="4.32017988s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:04.307808145 +0000 UTC m=+883.058275461" watchObservedRunningTime="2026-01-21 09:18:04.32017988 +0000 UTC m=+883.070647197" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.332666 4618 scope.go:117] "RemoveContainer" containerID="29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.334369 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" podStartSLOduration=4.334353727 podStartE2EDuration="4.334353727s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:04.326356046 +0000 UTC m=+883.076823354" watchObservedRunningTime="2026-01-21 09:18:04.334353727 +0000 UTC m=+883.084821044" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.344012 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmx97"] Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.348271 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmx97"] Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.412284 4618 scope.go:117] "RemoveContainer" containerID="c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201" Jan 21 09:18:04 crc 
kubenswrapper[4618]: I0121 09:18:04.623286 4618 scope.go:117] "RemoveContainer" containerID="be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9" Jan 21 09:18:04 crc kubenswrapper[4618]: E0121 09:18:04.628568 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9\": container with ID starting with be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9 not found: ID does not exist" containerID="be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.628602 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9"} err="failed to get container status \"be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9\": rpc error: code = NotFound desc = could not find container \"be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9\": container with ID starting with be5e4af63d697b29e5087db11976ab7ef5363bccc29103eabeaff48150ac4ef9 not found: ID does not exist" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.628621 4618 scope.go:117] "RemoveContainer" containerID="29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2" Jan 21 09:18:04 crc kubenswrapper[4618]: E0121 09:18:04.629169 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2\": container with ID starting with 29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2 not found: ID does not exist" containerID="29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.629192 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2"} err="failed to get container status \"29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2\": rpc error: code = NotFound desc = could not find container \"29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2\": container with ID starting with 29c1085bf765e62cf29e73a6c71e0669890dfb2fffc3e605883ffe6921ca80b2 not found: ID does not exist" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.629205 4618 scope.go:117] "RemoveContainer" containerID="c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201" Jan 21 09:18:04 crc kubenswrapper[4618]: E0121 09:18:04.630470 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201\": container with ID starting with c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201 not found: ID does not exist" containerID="c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.630493 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201"} err="failed to get container status \"c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201\": rpc error: code = NotFound desc = could not find container \"c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201\": container with ID starting with c46ebc20bdcd05af49e0be8964a074df7c76eab3cd7b1c68f92ec4e4228e0201 not found: ID does not exist" Jan 21 09:18:04 crc kubenswrapper[4618]: I0121 09:18:04.883610 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.045195 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwldc\" (UniqueName: \"kubernetes.io/projected/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-kube-api-access-dwldc\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.045415 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-logs\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.045518 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-internal-tls-certs\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.045733 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-logs" (OuterVolumeSpecName: "logs") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046172 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-httpd-run\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046277 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-scripts\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046328 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-combined-ca-bundle\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046379 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046450 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-config-data\") pod \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\" (UID: \"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6\") " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046458 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-httpd-run" (OuterVolumeSpecName: 
"httpd-run") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046918 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.046936 4618 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.063315 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-scripts" (OuterVolumeSpecName: "scripts") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.063375 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.063793 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-kube-api-access-dwldc" (OuterVolumeSpecName: "kube-api-access-dwldc") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "kube-api-access-dwldc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.076053 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.088185 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-config-data" (OuterVolumeSpecName: "config-data") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.106810 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" (UID: "9f4975ba-c37d-42cb-af29-49e7d5e4e2b6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.151339 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.151365 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.151398 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.151411 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.151420 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwldc\" (UniqueName: \"kubernetes.io/projected/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-kube-api-access-dwldc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.151437 4618 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.170774 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.256438 4618 reconciler_common.go:293] "Volume detached for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.289804 4618 generic.go:334] "Generic (PLEG): container finished" podID="7c091558-e8d7-4113-85b5-ff0abc314545" containerID="5a50dc9770b62da31110e66afd6eb08ea3e2ecd712f9ca2f6104e92320dee042" exitCode=0 Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.289844 4618 generic.go:334] "Generic (PLEG): container finished" podID="7c091558-e8d7-4113-85b5-ff0abc314545" containerID="90c8266a3bd9931b8c12d03c10bcf62281598d91509d2ec778af9dcbfc911b92" exitCode=143 Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.289890 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c091558-e8d7-4113-85b5-ff0abc314545","Type":"ContainerDied","Data":"5a50dc9770b62da31110e66afd6eb08ea3e2ecd712f9ca2f6104e92320dee042"} Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.289916 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c091558-e8d7-4113-85b5-ff0abc314545","Type":"ContainerDied","Data":"90c8266a3bd9931b8c12d03c10bcf62281598d91509d2ec778af9dcbfc911b92"} Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.294749 4618 generic.go:334] "Generic (PLEG): container finished" podID="b7894c60-e40c-4cbc-8938-660a8c6791b9" containerID="a1b897c6d1a799b4df6a433b824601689bfbad2a1b7ee1e32838121e14b0cb39" exitCode=0 Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.294805 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2qh7g" event={"ID":"b7894c60-e40c-4cbc-8938-660a8c6791b9","Type":"ContainerDied","Data":"a1b897c6d1a799b4df6a433b824601689bfbad2a1b7ee1e32838121e14b0cb39"} Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299102 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" 
containerID="f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296" exitCode=0 Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299128 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerID="b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6" exitCode=143 Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299165 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6","Type":"ContainerDied","Data":"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296"} Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299223 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6","Type":"ContainerDied","Data":"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6"} Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299237 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4975ba-c37d-42cb-af29-49e7d5e4e2b6","Type":"ContainerDied","Data":"915d286c7e421f6424c68ba943d52bdbf756b5ed403be644da0a482285732f70"} Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299254 4618 scope.go:117] "RemoveContainer" containerID="f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.299222 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.336275 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.341891 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.356170 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:05 crc kubenswrapper[4618]: E0121 09:18:05.357342 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="extract-content" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357365 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="extract-content" Jan 21 09:18:05 crc kubenswrapper[4618]: E0121 09:18:05.357383 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="registry-server" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357389 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="registry-server" Jan 21 09:18:05 crc kubenswrapper[4618]: E0121 09:18:05.357403 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-httpd" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357410 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-httpd" Jan 21 09:18:05 crc kubenswrapper[4618]: E0121 09:18:05.357422 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-log" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 
09:18:05.357438 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-log" Jan 21 09:18:05 crc kubenswrapper[4618]: E0121 09:18:05.357523 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="extract-utilities" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357530 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="extract-utilities" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357707 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-httpd" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357729 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9e4105-9c76-46df-a046-76f2efb55036" containerName="registry-server" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.357740 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" containerName="glance-log" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.358721 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.361008 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.363857 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.384838 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.462758 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.462834 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.462905 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.462949 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.462972 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.463206 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcs5\" (UniqueName: \"kubernetes.io/projected/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-kube-api-access-lpcs5\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.463323 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.463822 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.548221 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9f4975ba-c37d-42cb-af29-49e7d5e4e2b6" path="/var/lib/kubelet/pods/9f4975ba-c37d-42cb-af29-49e7d5e4e2b6/volumes" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.548874 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9e4105-9c76-46df-a046-76f2efb55036" path="/var/lib/kubelet/pods/af9e4105-9c76-46df-a046-76f2efb55036/volumes" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570068 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570166 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570247 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570291 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570310 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570329 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570345 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.570969 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.571183 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.571281 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.571631 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpcs5\" (UniqueName: \"kubernetes.io/projected/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-kube-api-access-lpcs5\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.577252 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.577319 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.586861 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.591230 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpcs5\" (UniqueName: 
\"kubernetes.io/projected/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-kube-api-access-lpcs5\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.593280 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.603961 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:05 crc kubenswrapper[4618]: I0121 09:18:05.752688 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.085522 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9f546b547-ct97b"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.123336 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7784c76494-zjhpz"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.124650 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.129208 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.145740 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7784c76494-zjhpz"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.161593 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575d89d597-bq5bt"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.174513 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.192536 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-87c49d4f8-74x7z"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.195404 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.208332 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-87c49d4f8-74x7z"] Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261515 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-combined-ca-bundle\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261578 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-config-data\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " 
pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261721 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-combined-ca-bundle\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261771 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cjcd\" (UniqueName: \"kubernetes.io/projected/696c8b1d-e84a-45de-bb32-d2b5526bfabc-kube-api-access-6cjcd\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261797 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-logs\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261832 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcnn7\" (UniqueName: \"kubernetes.io/projected/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-kube-api-access-jcnn7\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.261962 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-scripts\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " 
pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262006 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c8b1d-e84a-45de-bb32-d2b5526bfabc-logs\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262061 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-scripts\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262085 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-horizon-tls-certs\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262109 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-secret-key\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262187 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-tls-certs\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " 
pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-horizon-secret-key\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.262248 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-config-data\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364552 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-scripts\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364611 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-horizon-tls-certs\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364639 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-secret-key\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc 
kubenswrapper[4618]: I0121 09:18:09.364697 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-tls-certs\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364739 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-horizon-secret-key\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364771 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-config-data\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364859 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-combined-ca-bundle\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364881 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-config-data\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364940 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-combined-ca-bundle\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364973 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-logs\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.364995 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cjcd\" (UniqueName: \"kubernetes.io/projected/696c8b1d-e84a-45de-bb32-d2b5526bfabc-kube-api-access-6cjcd\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.365022 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcnn7\" (UniqueName: \"kubernetes.io/projected/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-kube-api-access-jcnn7\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.365055 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-scripts\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.365080 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/696c8b1d-e84a-45de-bb32-d2b5526bfabc-logs\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.365568 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c8b1d-e84a-45de-bb32-d2b5526bfabc-logs\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.366174 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-scripts\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.368053 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-config-data\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.374155 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-logs\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.375496 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-config-data\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 
09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.376189 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-scripts\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.376760 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-secret-key\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.377115 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-horizon-tls-certs\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.378213 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-combined-ca-bundle\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.391569 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-tls-certs\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.392288 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-combined-ca-bundle\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.392646 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cjcd\" (UniqueName: \"kubernetes.io/projected/696c8b1d-e84a-45de-bb32-d2b5526bfabc-kube-api-access-6cjcd\") pod \"horizon-7784c76494-zjhpz\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.401025 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-horizon-secret-key\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.430447 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcnn7\" (UniqueName: \"kubernetes.io/projected/d4a5a9b2-1432-43cc-bfe1-58285caf06ea-kube-api-access-jcnn7\") pod \"horizon-87c49d4f8-74x7z\" (UID: \"d4a5a9b2-1432-43cc-bfe1-58285caf06ea\") " pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.448524 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:09 crc kubenswrapper[4618]: I0121 09:18:09.507947 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:10 crc kubenswrapper[4618]: E0121 09:18:10.103725 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76cf94b1_8904_4389_8ef3_8dd36ea02ecf.slice\": RecentStats: unable to find data in memory cache]" Jan 21 09:18:10 crc kubenswrapper[4618]: I0121 09:18:10.382241 4618 generic.go:334] "Generic (PLEG): container finished" podID="1f0f8544-4e4e-49e4-8eff-43529c9e607b" containerID="42ab8ca3f7c184a911cc0c2f7df404d8872facf10b1646eb88069c91f79c5f48" exitCode=0 Jan 21 09:18:10 crc kubenswrapper[4618]: I0121 09:18:10.382297 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jqn8x" event={"ID":"1f0f8544-4e4e-49e4-8eff-43529c9e607b","Type":"ContainerDied","Data":"42ab8ca3f7c184a911cc0c2f7df404d8872facf10b1646eb88069c91f79c5f48"} Jan 21 09:18:11 crc kubenswrapper[4618]: I0121 09:18:11.158470 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:11 crc kubenswrapper[4618]: I0121 09:18:11.207445 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-mvdfm"] Jan 21 09:18:11 crc kubenswrapper[4618]: I0121 09:18:11.207678 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" containerID="cri-o://d72d9fd0db4bd4a5fa233b9faa4bcdd0adf4fd93a712b7dbf73f72a108c5a4c4" gracePeriod=10 Jan 21 09:18:11 crc kubenswrapper[4618]: I0121 09:18:11.391973 4618 generic.go:334] "Generic (PLEG): container finished" 
podID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerID="d72d9fd0db4bd4a5fa233b9faa4bcdd0adf4fd93a712b7dbf73f72a108c5a4c4" exitCode=0 Jan 21 09:18:11 crc kubenswrapper[4618]: I0121 09:18:11.392132 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" event={"ID":"97a18a3f-1479-4331-88b7-ca75b69d1187","Type":"ContainerDied","Data":"d72d9fd0db4bd4a5fa233b9faa4bcdd0adf4fd93a712b7dbf73f72a108c5a4c4"} Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.625386 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.632667 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.771494 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-scripts\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.771717 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-config-data\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.771826 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-httpd-run\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.771856 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-logs\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.771894 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-credential-keys\") pod \"b7894c60-e40c-4cbc-8938-660a8c6791b9\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.771967 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2bd\" (UniqueName: \"kubernetes.io/projected/b7894c60-e40c-4cbc-8938-660a8c6791b9-kube-api-access-dh2bd\") pod \"b7894c60-e40c-4cbc-8938-660a8c6791b9\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772053 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-combined-ca-bundle\") pod \"b7894c60-e40c-4cbc-8938-660a8c6791b9\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772154 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772188 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v94z6\" (UniqueName: \"kubernetes.io/projected/7c091558-e8d7-4113-85b5-ff0abc314545-kube-api-access-v94z6\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 
09:18:13.772228 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-fernet-keys\") pod \"b7894c60-e40c-4cbc-8938-660a8c6791b9\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772255 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-public-tls-certs\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772289 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-config-data\") pod \"b7894c60-e40c-4cbc-8938-660a8c6791b9\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772326 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-scripts\") pod \"b7894c60-e40c-4cbc-8938-660a8c6791b9\" (UID: \"b7894c60-e40c-4cbc-8938-660a8c6791b9\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.772411 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-combined-ca-bundle\") pod \"7c091558-e8d7-4113-85b5-ff0abc314545\" (UID: \"7c091558-e8d7-4113-85b5-ff0abc314545\") " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.773505 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-logs" (OuterVolumeSpecName: "logs") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: 
"7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.773947 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.775018 4618 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.775043 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c091558-e8d7-4113-85b5-ff0abc314545-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.781248 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-scripts" (OuterVolumeSpecName: "scripts") pod "b7894c60-e40c-4cbc-8938-660a8c6791b9" (UID: "b7894c60-e40c-4cbc-8938-660a8c6791b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.781267 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.781890 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b7894c60-e40c-4cbc-8938-660a8c6791b9" (UID: "b7894c60-e40c-4cbc-8938-660a8c6791b9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.781960 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b7894c60-e40c-4cbc-8938-660a8c6791b9" (UID: "b7894c60-e40c-4cbc-8938-660a8c6791b9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.782857 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-scripts" (OuterVolumeSpecName: "scripts") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.782887 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c091558-e8d7-4113-85b5-ff0abc314545-kube-api-access-v94z6" (OuterVolumeSpecName: "kube-api-access-v94z6") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "kube-api-access-v94z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.797249 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7894c60-e40c-4cbc-8938-660a8c6791b9-kube-api-access-dh2bd" (OuterVolumeSpecName: "kube-api-access-dh2bd") pod "b7894c60-e40c-4cbc-8938-660a8c6791b9" (UID: "b7894c60-e40c-4cbc-8938-660a8c6791b9"). InnerVolumeSpecName "kube-api-access-dh2bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.803311 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-config-data" (OuterVolumeSpecName: "config-data") pod "b7894c60-e40c-4cbc-8938-660a8c6791b9" (UID: "b7894c60-e40c-4cbc-8938-660a8c6791b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.803908 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.806952 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7894c60-e40c-4cbc-8938-660a8c6791b9" (UID: "b7894c60-e40c-4cbc-8938-660a8c6791b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.817950 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-config-data" (OuterVolumeSpecName: "config-data") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.818384 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c091558-e8d7-4113-85b5-ff0abc314545" (UID: "7c091558-e8d7-4113-85b5-ff0abc314545"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877351 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877379 4618 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877394 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2bd\" (UniqueName: \"kubernetes.io/projected/b7894c60-e40c-4cbc-8938-660a8c6791b9-kube-api-access-dh2bd\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877406 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877435 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877457 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v94z6\" (UniqueName: \"kubernetes.io/projected/7c091558-e8d7-4113-85b5-ff0abc314545-kube-api-access-v94z6\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877487 4618 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877500 4618 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877509 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877517 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7894c60-e40c-4cbc-8938-660a8c6791b9-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877526 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.877534 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7c091558-e8d7-4113-85b5-ff0abc314545-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.890763 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 21 09:18:13 crc kubenswrapper[4618]: I0121 09:18:13.979135 4618 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.420490 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2qh7g" event={"ID":"b7894c60-e40c-4cbc-8938-660a8c6791b9","Type":"ContainerDied","Data":"df2f207300ffca94a03031cc7e319bae123ac2fec933336d759db8f7776987a8"} Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.420525 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2qh7g" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.420537 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df2f207300ffca94a03031cc7e319bae123ac2fec933336d759db8f7776987a8" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.423534 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c091558-e8d7-4113-85b5-ff0abc314545","Type":"ContainerDied","Data":"678eb37fd87c66ecae07531bc945223221b538c4e104ccc75a2187d6aa436e45"} Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.423650 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.470216 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.487225 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.494193 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:14 crc kubenswrapper[4618]: E0121 09:18:14.495314 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-httpd" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.495340 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-httpd" Jan 21 09:18:14 crc kubenswrapper[4618]: E0121 09:18:14.495362 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-log" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.495368 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-log" Jan 21 09:18:14 crc kubenswrapper[4618]: E0121 09:18:14.495380 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7894c60-e40c-4cbc-8938-660a8c6791b9" containerName="keystone-bootstrap" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.495389 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7894c60-e40c-4cbc-8938-660a8c6791b9" containerName="keystone-bootstrap" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.495587 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-httpd" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 
09:18:14.495599 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" containerName="glance-log" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.495610 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7894c60-e40c-4cbc-8938-660a8c6791b9" containerName="keystone-bootstrap" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.496697 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.498747 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.500351 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.508320 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596196 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-logs\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596278 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-config-data\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596329 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596375 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596581 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596622 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz64n\" (UniqueName: \"kubernetes.io/projected/df21bdd0-078c-45ea-9027-7c9c70f53513-kube-api-access-pz64n\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596682 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-scripts\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.596738 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701600 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701707 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz64n\" (UniqueName: \"kubernetes.io/projected/df21bdd0-078c-45ea-9027-7c9c70f53513-kube-api-access-pz64n\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701767 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-scripts\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701832 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701911 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-logs\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701957 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-config-data\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.701994 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.702044 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.702624 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.705096 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.713027 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-config-data\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.713339 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-logs\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.721329 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.724089 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-scripts\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.724949 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2qh7g"] Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.727628 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz64n\" (UniqueName: 
\"kubernetes.io/projected/df21bdd0-078c-45ea-9027-7c9c70f53513-kube-api-access-pz64n\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.728940 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.736302 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2qh7g"] Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.737673 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.815696 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.830571 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-l5lv6"] Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.832870 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.836776 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l8whb" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.836875 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.837117 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.841770 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.841827 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 09:18:14 crc kubenswrapper[4618]: I0121 09:18:14.848793 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l5lv6"] Jan 21 09:18:14 crc kubenswrapper[4618]: E0121 09:18:14.944581 4618 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b" Jan 21 09:18:14 crc kubenswrapper[4618]: E0121 09:18:14.945057 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cd9l2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-q2tdw_openstack(12353eaa-fb43-415a-b590-f69fadbdd4e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 09:18:14 crc kubenswrapper[4618]: E0121 09:18:14.946416 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-q2tdw" podUID="12353eaa-fb43-415a-b590-f69fadbdd4e1" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.012576 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-scripts\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.012729 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-credential-keys\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.013170 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-combined-ca-bundle\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.013210 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-fernet-keys\") pod 
\"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.013239 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-config-data\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.013283 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4wk\" (UniqueName: \"kubernetes.io/projected/8d13246f-0095-4316-9769-2173765b9ae6-kube-api-access-rg4wk\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.114455 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-credential-keys\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.115489 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-combined-ca-bundle\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.115595 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-fernet-keys\") pod \"keystone-bootstrap-l5lv6\" (UID: 
\"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.116093 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-config-data\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.116165 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4wk\" (UniqueName: \"kubernetes.io/projected/8d13246f-0095-4316-9769-2173765b9ae6-kube-api-access-rg4wk\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.116273 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-scripts\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.119100 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-combined-ca-bundle\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.119740 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-scripts\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 
09:18:15.120527 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-fernet-keys\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.120779 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-credential-keys\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.122350 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-config-data\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.132564 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4wk\" (UniqueName: \"kubernetes.io/projected/8d13246f-0095-4316-9769-2173765b9ae6-kube-api-access-rg4wk\") pod \"keystone-bootstrap-l5lv6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.151651 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:15 crc kubenswrapper[4618]: E0121 09:18:15.432457 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b\\\"\"" pod="openstack/placement-db-sync-q2tdw" podUID="12353eaa-fb43-415a-b590-f69fadbdd4e1" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.555803 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c091558-e8d7-4113-85b5-ff0abc314545" path="/var/lib/kubelet/pods/7c091558-e8d7-4113-85b5-ff0abc314545/volumes" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.556820 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Jan 21 09:18:15 crc kubenswrapper[4618]: I0121 09:18:15.556997 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7894c60-e40c-4cbc-8938-660a8c6791b9" path="/var/lib/kubelet/pods/b7894c60-e40c-4cbc-8938-660a8c6791b9/volumes" Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 09:18:17.061404 4618 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7" Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 09:18:17.061867 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncch65bh658h5bh76h5c8h5c8h64bh8h598h84hcfhbdhdfh95h64fh64ch99h589hdch68h658h5fdh64ch58h5dbh5f4h8fh656hd7h55bh66dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p86x8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-575d89d597-bq5bt_openstack(addf76d0-4280-433e-8002-7a61d78b6f11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 
09:18:17.064960 4618 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7" Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 09:18:17.065190 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n77h59hcfh585h584h9bh657h5fdh558hc8h569hc4hd9h596h66fh664h5f4h57fh5h56ch5fbh95h5d5hdch5chb9hd5h676h59fhdch5d9hb7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjb9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnl
yRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-856d56fb7f-qjx5g_openstack(ba25c1a2-83b3-4f25-bc8a-030749218a5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 09:18:17.065879 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7\\\"\"]" pod="openstack/horizon-575d89d597-bq5bt" podUID="addf76d0-4280-433e-8002-7a61d78b6f11" Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 09:18:17.067753 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7\\\"\"]" pod="openstack/horizon-856d56fb7f-qjx5g" podUID="ba25c1a2-83b3-4f25-bc8a-030749218a5e" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.119492 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.155953 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-config\") pod \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.156196 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-combined-ca-bundle\") pod \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.156333 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nz4\" (UniqueName: \"kubernetes.io/projected/1f0f8544-4e4e-49e4-8eff-43529c9e607b-kube-api-access-48nz4\") pod \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\" (UID: \"1f0f8544-4e4e-49e4-8eff-43529c9e607b\") " Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.160129 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0f8544-4e4e-49e4-8eff-43529c9e607b-kube-api-access-48nz4" (OuterVolumeSpecName: "kube-api-access-48nz4") pod "1f0f8544-4e4e-49e4-8eff-43529c9e607b" (UID: "1f0f8544-4e4e-49e4-8eff-43529c9e607b"). InnerVolumeSpecName "kube-api-access-48nz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.175227 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f0f8544-4e4e-49e4-8eff-43529c9e607b" (UID: "1f0f8544-4e4e-49e4-8eff-43529c9e607b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.175578 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-config" (OuterVolumeSpecName: "config") pod "1f0f8544-4e4e-49e4-8eff-43529c9e607b" (UID: "1f0f8544-4e4e-49e4-8eff-43529c9e607b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.259281 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48nz4\" (UniqueName: \"kubernetes.io/projected/1f0f8544-4e4e-49e4-8eff-43529c9e607b-kube-api-access-48nz4\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.259320 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.259339 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0f8544-4e4e-49e4-8eff-43529c9e607b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.458691 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sdwbk"] Jan 21 09:18:17 crc kubenswrapper[4618]: E0121 09:18:17.459574 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0f8544-4e4e-49e4-8eff-43529c9e607b" containerName="neutron-db-sync" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.459602 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0f8544-4e4e-49e4-8eff-43529c9e607b" containerName="neutron-db-sync" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.459817 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f0f8544-4e4e-49e4-8eff-43529c9e607b" containerName="neutron-db-sync" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.461475 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.471876 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jqn8x" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.472278 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jqn8x" event={"ID":"1f0f8544-4e4e-49e4-8eff-43529c9e607b","Type":"ContainerDied","Data":"42c20c2dcb0f642e8a7fef5cf72d0c6432a831353271fe84bb83b67bee33790f"} Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.472325 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42c20c2dcb0f642e8a7fef5cf72d0c6432a831353271fe84bb83b67bee33790f" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.472344 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdwbk"] Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.673061 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-utilities\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.673441 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-catalog-content\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 
09:18:17.673739 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5z9c\" (UniqueName: \"kubernetes.io/projected/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-kube-api-access-c5z9c\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.775329 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-utilities\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.775424 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-catalog-content\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.775585 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5z9c\" (UniqueName: \"kubernetes.io/projected/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-kube-api-access-c5z9c\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.775875 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-utilities\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.776117 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-catalog-content\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:17 crc kubenswrapper[4618]: I0121 09:18:17.794538 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5z9c\" (UniqueName: \"kubernetes.io/projected/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-kube-api-access-c5z9c\") pod \"redhat-operators-sdwbk\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.087992 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.243357 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685444497c-gxfjk"] Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.245510 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.256695 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685444497c-gxfjk"] Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.323434 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69f66b98c6-zmvmx"] Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.324945 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.327186 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q5dj5" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.327322 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.327545 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.327613 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.338020 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69f66b98c6-zmvmx"] Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385208 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385245 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-config\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385413 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-config\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: 
\"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385652 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8th6s\" (UniqueName: \"kubernetes.io/projected/7b0e8004-6943-476d-9fab-36846adbc5de-kube-api-access-8th6s\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385723 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385804 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2mj\" (UniqueName: \"kubernetes.io/projected/3f873699-80a9-4e80-93fc-7c99e5cd5e69-kube-api-access-hx2mj\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.385945 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-ovndb-tls-certs\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.386006 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-svc\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.386055 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-combined-ca-bundle\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.386131 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.386217 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-httpd-config\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.487835 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.487871 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-config\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.487895 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-config\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.487959 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8th6s\" (UniqueName: \"kubernetes.io/projected/7b0e8004-6943-476d-9fab-36846adbc5de-kube-api-access-8th6s\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.487984 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488010 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2mj\" (UniqueName: \"kubernetes.io/projected/3f873699-80a9-4e80-93fc-7c99e5cd5e69-kube-api-access-hx2mj\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488048 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-ovndb-tls-certs\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488078 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-svc\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488112 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-combined-ca-bundle\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488190 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488231 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-httpd-config\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488905 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-sb\") pod 
\"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.488966 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-config\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.489595 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-svc\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.489598 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.490506 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.496197 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-httpd-config\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" 
Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.496729 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-ovndb-tls-certs\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.505742 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-combined-ca-bundle\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.508706 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8th6s\" (UniqueName: \"kubernetes.io/projected/7b0e8004-6943-476d-9fab-36846adbc5de-kube-api-access-8th6s\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.521035 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2mj\" (UniqueName: \"kubernetes.io/projected/3f873699-80a9-4e80-93fc-7c99e5cd5e69-kube-api-access-hx2mj\") pod \"dnsmasq-dns-685444497c-gxfjk\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.524514 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-config\") pod \"neutron-69f66b98c6-zmvmx\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.561975 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:18 crc kubenswrapper[4618]: I0121 09:18:18.646283 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:20 crc kubenswrapper[4618]: E0121 09:18:20.303700 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2858edc9_0823_41b9_9c3a_ca8eecb450fa.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76cf94b1_8904_4389_8ef3_8dd36ea02ecf.slice\": RecentStats: unable to find data in memory cache]" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.448499 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58985598b5-rf45g"] Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.450745 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.454299 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.454310 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.465733 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58985598b5-rf45g"] Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524180 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqj6\" (UniqueName: \"kubernetes.io/projected/dd14fbe1-01de-41f5-9247-d15844d8c697-kube-api-access-jgqj6\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524355 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-httpd-config\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524446 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-public-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524484 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-internal-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524528 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-config\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524578 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-ovndb-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.524647 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-combined-ca-bundle\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.556786 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.626356 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqj6\" (UniqueName: \"kubernetes.io/projected/dd14fbe1-01de-41f5-9247-d15844d8c697-kube-api-access-jgqj6\") pod 
\"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.626423 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-httpd-config\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.626504 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-public-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.627475 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-internal-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.627556 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-config\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.627609 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-ovndb-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " 
pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.627630 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-combined-ca-bundle\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.642660 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-public-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.642726 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqj6\" (UniqueName: \"kubernetes.io/projected/dd14fbe1-01de-41f5-9247-d15844d8c697-kube-api-access-jgqj6\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.643506 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-httpd-config\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.643824 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-ovndb-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.645940 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-internal-tls-certs\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.646004 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-config\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.650166 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14fbe1-01de-41f5-9247-d15844d8c697-combined-ca-bundle\") pod \"neutron-58985598b5-rf45g\" (UID: \"dd14fbe1-01de-41f5-9247-d15844d8c697\") " pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:20 crc kubenswrapper[4618]: I0121 09:18:20.771001 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.343833 4618 scope.go:117] "RemoveContainer" containerID="b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.419906 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.424847 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497031 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-config-data\") pod \"addf76d0-4280-433e-8002-7a61d78b6f11\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497096 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/addf76d0-4280-433e-8002-7a61d78b6f11-horizon-secret-key\") pod \"addf76d0-4280-433e-8002-7a61d78b6f11\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497130 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-scripts\") pod \"addf76d0-4280-433e-8002-7a61d78b6f11\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497172 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p86x8\" (UniqueName: \"kubernetes.io/projected/addf76d0-4280-433e-8002-7a61d78b6f11-kube-api-access-p86x8\") pod \"addf76d0-4280-433e-8002-7a61d78b6f11\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497286 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-scripts\") pod \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497364 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ba25c1a2-83b3-4f25-bc8a-030749218a5e-logs\") pod \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497695 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba25c1a2-83b3-4f25-bc8a-030749218a5e-logs" (OuterVolumeSpecName: "logs") pod "ba25c1a2-83b3-4f25-bc8a-030749218a5e" (UID: "ba25c1a2-83b3-4f25-bc8a-030749218a5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.497946 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-scripts" (OuterVolumeSpecName: "scripts") pod "ba25c1a2-83b3-4f25-bc8a-030749218a5e" (UID: "ba25c1a2-83b3-4f25-bc8a-030749218a5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498004 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-scripts" (OuterVolumeSpecName: "scripts") pod "addf76d0-4280-433e-8002-7a61d78b6f11" (UID: "addf76d0-4280-433e-8002-7a61d78b6f11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498025 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf76d0-4280-433e-8002-7a61d78b6f11-logs\") pod \"addf76d0-4280-433e-8002-7a61d78b6f11\" (UID: \"addf76d0-4280-433e-8002-7a61d78b6f11\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498115 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjb9q\" (UniqueName: \"kubernetes.io/projected/ba25c1a2-83b3-4f25-bc8a-030749218a5e-kube-api-access-zjb9q\") pod \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498222 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-config-data\") pod \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498288 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba25c1a2-83b3-4f25-bc8a-030749218a5e-horizon-secret-key\") pod \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\" (UID: \"ba25c1a2-83b3-4f25-bc8a-030749218a5e\") " Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498765 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-config-data" (OuterVolumeSpecName: "config-data") pod "ba25c1a2-83b3-4f25-bc8a-030749218a5e" (UID: "ba25c1a2-83b3-4f25-bc8a-030749218a5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498820 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498836 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498846 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25c1a2-83b3-4f25-bc8a-030749218a5e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498868 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/addf76d0-4280-433e-8002-7a61d78b6f11-logs" (OuterVolumeSpecName: "logs") pod "addf76d0-4280-433e-8002-7a61d78b6f11" (UID: "addf76d0-4280-433e-8002-7a61d78b6f11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.498894 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-config-data" (OuterVolumeSpecName: "config-data") pod "addf76d0-4280-433e-8002-7a61d78b6f11" (UID: "addf76d0-4280-433e-8002-7a61d78b6f11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.504793 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/addf76d0-4280-433e-8002-7a61d78b6f11-kube-api-access-p86x8" (OuterVolumeSpecName: "kube-api-access-p86x8") pod "addf76d0-4280-433e-8002-7a61d78b6f11" (UID: "addf76d0-4280-433e-8002-7a61d78b6f11"). InnerVolumeSpecName "kube-api-access-p86x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.511322 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba25c1a2-83b3-4f25-bc8a-030749218a5e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ba25c1a2-83b3-4f25-bc8a-030749218a5e" (UID: "ba25c1a2-83b3-4f25-bc8a-030749218a5e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.513251 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf76d0-4280-433e-8002-7a61d78b6f11-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "addf76d0-4280-433e-8002-7a61d78b6f11" (UID: "addf76d0-4280-433e-8002-7a61d78b6f11"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.515377 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba25c1a2-83b3-4f25-bc8a-030749218a5e-kube-api-access-zjb9q" (OuterVolumeSpecName: "kube-api-access-zjb9q") pod "ba25c1a2-83b3-4f25-bc8a-030749218a5e" (UID: "ba25c1a2-83b3-4f25-bc8a-030749218a5e"). InnerVolumeSpecName "kube-api-access-zjb9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.537931 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575d89d597-bq5bt" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.538286 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575d89d597-bq5bt" event={"ID":"addf76d0-4280-433e-8002-7a61d78b6f11","Type":"ContainerDied","Data":"84d236deeff9c94d9425cf8c8ad70d3398c2fd4a391a138029bbd476f6340bcc"} Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.543213 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-856d56fb7f-qjx5g" event={"ID":"ba25c1a2-83b3-4f25-bc8a-030749218a5e","Type":"ContainerDied","Data":"9c45f19f2f72d9ae942eff8c499bc022d87699078a9eab7acef35631605b7209"} Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.543258 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-856d56fb7f-qjx5g" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600159 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/addf76d0-4280-433e-8002-7a61d78b6f11-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600196 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjb9q\" (UniqueName: \"kubernetes.io/projected/ba25c1a2-83b3-4f25-bc8a-030749218a5e-kube-api-access-zjb9q\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600206 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba25c1a2-83b3-4f25-bc8a-030749218a5e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600217 4618 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba25c1a2-83b3-4f25-bc8a-030749218a5e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600227 
4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/addf76d0-4280-433e-8002-7a61d78b6f11-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600236 4618 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/addf76d0-4280-433e-8002-7a61d78b6f11-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.600245 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p86x8\" (UniqueName: \"kubernetes.io/projected/addf76d0-4280-433e-8002-7a61d78b6f11-kube-api-access-p86x8\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.609224 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575d89d597-bq5bt"] Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.617224 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-575d89d597-bq5bt"] Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.634404 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-856d56fb7f-qjx5g"] Jan 21 09:18:24 crc kubenswrapper[4618]: I0121 09:18:24.638937 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-856d56fb7f-qjx5g"] Jan 21 09:18:24 crc kubenswrapper[4618]: E0121 09:18:24.776576 4618 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Jan 21 09:18:24 crc kubenswrapper[4618]: E0121 09:18:24.776747 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8r9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wgbvc_openstack(1f724147-bec3-4df8-8f5d-cb9ff9e128e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 09:18:24 crc kubenswrapper[4618]: E0121 09:18:24.777949 4618 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wgbvc" podUID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.546469 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="addf76d0-4280-433e-8002-7a61d78b6f11" path="/var/lib/kubelet/pods/addf76d0-4280-433e-8002-7a61d78b6f11/volumes" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.547109 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba25c1a2-83b3-4f25-bc8a-030749218a5e" path="/var/lib/kubelet/pods/ba25c1a2-83b3-4f25-bc8a-030749218a5e/volumes" Jan 21 09:18:25 crc kubenswrapper[4618]: E0121 09:18:25.554259 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-wgbvc" podUID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" Jan 21 09:18:25 crc kubenswrapper[4618]: E0121 09:18:25.736724 4618 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 21 09:18:25 crc kubenswrapper[4618]: E0121 09:18:25.736876 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmc9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vs7mz_openstack(46a5bcfd-6e6c-4070-b7e2-b2e90789f888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 09:18:25 crc kubenswrapper[4618]: E0121 09:18:25.738080 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vs7mz" podUID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.811662 4618 scope.go:117] "RemoveContainer" containerID="f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296" Jan 21 09:18:25 crc kubenswrapper[4618]: E0121 09:18:25.832923 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296\": container with ID starting with f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296 not found: ID does not exist" containerID="f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.832975 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296"} err="failed to get container status \"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296\": rpc error: code = NotFound desc = could not find container \"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296\": container with ID starting with f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296 not found: ID does not exist" Jan 21 09:18:25 crc 
kubenswrapper[4618]: I0121 09:18:25.833005 4618 scope.go:117] "RemoveContainer" containerID="b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6" Jan 21 09:18:25 crc kubenswrapper[4618]: E0121 09:18:25.833454 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6\": container with ID starting with b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6 not found: ID does not exist" containerID="b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.833493 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6"} err="failed to get container status \"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6\": rpc error: code = NotFound desc = could not find container \"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6\": container with ID starting with b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6 not found: ID does not exist" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.833510 4618 scope.go:117] "RemoveContainer" containerID="f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.848611 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296"} err="failed to get container status \"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296\": rpc error: code = NotFound desc = could not find container \"f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296\": container with ID starting with f1dcb206dc890288599b0b9c5b613e31445440ca6214909a25b25ae1633f6296 not found: ID does not exist" Jan 21 
09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.848636 4618 scope.go:117] "RemoveContainer" containerID="b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.848992 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6"} err="failed to get container status \"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6\": rpc error: code = NotFound desc = could not find container \"b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6\": container with ID starting with b9d1c827e60e8b92bc16aad66b837e04a46e03958ee256ca4d07f0102b6d10a6 not found: ID does not exist" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.849009 4618 scope.go:117] "RemoveContainer" containerID="5a50dc9770b62da31110e66afd6eb08ea3e2ecd712f9ca2f6104e92320dee042" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.908499 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.927751 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8spjk\" (UniqueName: \"kubernetes.io/projected/97a18a3f-1479-4331-88b7-ca75b69d1187-kube-api-access-8spjk\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.927830 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-swift-storage-0\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.927872 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.927913 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-sb\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.927969 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-config\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.928043 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-svc\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:25 crc kubenswrapper[4618]: I0121 09:18:25.951847 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a18a3f-1479-4331-88b7-ca75b69d1187-kube-api-access-8spjk" (OuterVolumeSpecName: "kube-api-access-8spjk") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "kube-api-access-8spjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.016133 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-config" (OuterVolumeSpecName: "config") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.029824 4618 scope.go:117] "RemoveContainer" containerID="90c8266a3bd9931b8c12d03c10bcf62281598d91509d2ec778af9dcbfc911b92" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.029940 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.030917 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.031045 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb\") pod \"97a18a3f-1479-4331-88b7-ca75b69d1187\" (UID: \"97a18a3f-1479-4331-88b7-ca75b69d1187\") " Jan 21 09:18:26 crc kubenswrapper[4618]: W0121 09:18:26.031296 4618 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/97a18a3f-1479-4331-88b7-ca75b69d1187/volumes/kubernetes.io~configmap/ovsdbserver-nb Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.031313 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.031751 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.031774 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.031784 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8spjk\" (UniqueName: \"kubernetes.io/projected/97a18a3f-1479-4331-88b7-ca75b69d1187-kube-api-access-8spjk\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.031800 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.037053 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.039567 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "97a18a3f-1479-4331-88b7-ca75b69d1187" (UID: "97a18a3f-1479-4331-88b7-ca75b69d1187"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.141264 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.141293 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97a18a3f-1479-4331-88b7-ca75b69d1187-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.156326 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-87c49d4f8-74x7z"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.578057 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerStarted","Data":"ff15d193ebe18d38e17a22e7ad3bef679e52c9d89e412a90ade1da5b3d0e1a62"} Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.585531 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" event={"ID":"97a18a3f-1479-4331-88b7-ca75b69d1187","Type":"ContainerDied","Data":"cb67e3bf0c56fc86c31c26c08a2ae96db8259f125e7e58f62d11c0eb04c910a6"} Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.585783 4618 scope.go:117] "RemoveContainer" containerID="d72d9fd0db4bd4a5fa233b9faa4bcdd0adf4fd93a712b7dbf73f72a108c5a4c4" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.585548 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.593123 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87c49d4f8-74x7z" event={"ID":"d4a5a9b2-1432-43cc-bfe1-58285caf06ea","Type":"ContainerStarted","Data":"3587a681640a2a7026835e17989585f0c1eb3aa21c848e4db401bddf0777b402"} Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.593168 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87c49d4f8-74x7z" event={"ID":"d4a5a9b2-1432-43cc-bfe1-58285caf06ea","Type":"ContainerStarted","Data":"986304ef1c1b116f36aa7144de6e5cb6d5c63032b069b265d2dee82eba9b8c7c"} Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.601907 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9f546b547-ct97b" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon-log" containerID="cri-o://c3ee741a332d0d27e315f8c53be9b88fa52fc79bc463094f0be8a9a5b958987c" gracePeriod=30 Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.602127 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f546b547-ct97b" event={"ID":"048b9318-e305-40c3-86e9-9081b01ca1cb","Type":"ContainerStarted","Data":"254815b64f63b11b08029d4b7856eadb7d21d7ec68730bdabed22c3ab54370a8"} Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.602448 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f546b547-ct97b" event={"ID":"048b9318-e305-40c3-86e9-9081b01ca1cb","Type":"ContainerStarted","Data":"c3ee741a332d0d27e315f8c53be9b88fa52fc79bc463094f0be8a9a5b958987c"} Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.602311 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9f546b547-ct97b" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon" containerID="cri-o://254815b64f63b11b08029d4b7856eadb7d21d7ec68730bdabed22c3ab54370a8" 
gracePeriod=30 Jan 21 09:18:26 crc kubenswrapper[4618]: E0121 09:18:26.602839 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-vs7mz" podUID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.625550 4618 scope.go:117] "RemoveContainer" containerID="3e267a047f1b12fc59aec5c16292a075e33b49ab2af6ea0ee7d58412fc8f894e" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.634454 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9f546b547-ct97b" podStartSLOduration=4.103731018 podStartE2EDuration="26.634415076s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="2026-01-21 09:18:02.269659019 +0000 UTC m=+881.020126336" lastFinishedPulling="2026-01-21 09:18:24.800343077 +0000 UTC m=+903.550810394" observedRunningTime="2026-01-21 09:18:26.622544101 +0000 UTC m=+905.373011419" watchObservedRunningTime="2026-01-21 09:18:26.634415076 +0000 UTC m=+905.384882394" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.653431 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7784c76494-zjhpz"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.667587 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-mvdfm"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.674186 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-mvdfm"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.677089 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.748186 4618 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/neutron-69f66b98c6-zmvmx"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.831369 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l5lv6"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.858781 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685444497c-gxfjk"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.918556 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdwbk"] Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.936162 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 09:18:26 crc kubenswrapper[4618]: I0121 09:18:26.956943 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58985598b5-rf45g"] Jan 21 09:18:26 crc kubenswrapper[4618]: W0121 09:18:26.972088 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd14fbe1_01de_41f5_9247_d15844d8c697.slice/crio-fc47e682de44a7de06e8319b9601da26665d8f1cdaf54bc61522ec0f43de0f0f WatchSource:0}: Error finding container fc47e682de44a7de06e8319b9601da26665d8f1cdaf54bc61522ec0f43de0f0f: Status 404 returned error can't find the container with id fc47e682de44a7de06e8319b9601da26665d8f1cdaf54bc61522ec0f43de0f0f Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.571643 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" path="/var/lib/kubelet/pods/97a18a3f-1479-4331-88b7-ca75b69d1187/volumes" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.613776 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87c49d4f8-74x7z" event={"ID":"d4a5a9b2-1432-43cc-bfe1-58285caf06ea","Type":"ContainerStarted","Data":"a3ee5f6d7c4d137cdbb04e0a9bcad609b770b5430cbb31fe7a4783c16fb079de"} Jan 21 
09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.618241 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d","Type":"ContainerStarted","Data":"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.618298 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d","Type":"ContainerStarted","Data":"261530b56cb4d98e7bb683600db0ee899b70032c7097223f85d497c0ae02046d"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.630702 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7784c76494-zjhpz" event={"ID":"696c8b1d-e84a-45de-bb32-d2b5526bfabc","Type":"ContainerStarted","Data":"8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.630737 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7784c76494-zjhpz" event={"ID":"696c8b1d-e84a-45de-bb32-d2b5526bfabc","Type":"ContainerStarted","Data":"9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.630748 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7784c76494-zjhpz" event={"ID":"696c8b1d-e84a-45de-bb32-d2b5526bfabc","Type":"ContainerStarted","Data":"ff270c36f1c149aba25c0e947bee74d8d5a7a61be871551f3b9ec9d234a663fc"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.638555 4618 generic.go:334] "Generic (PLEG): container finished" podID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerID="3fb8a61031f1cac97760e25ad51f629e6a8d14a1fd08eec903066ea1cfd7ec94" exitCode=0 Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.638732 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-gxfjk" 
event={"ID":"3f873699-80a9-4e80-93fc-7c99e5cd5e69","Type":"ContainerDied","Data":"3fb8a61031f1cac97760e25ad51f629e6a8d14a1fd08eec903066ea1cfd7ec94"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.638755 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-gxfjk" event={"ID":"3f873699-80a9-4e80-93fc-7c99e5cd5e69","Type":"ContainerStarted","Data":"bfc57341a3870e1c06026b9b03a9a323dd388bb5403bb4f4af7ebfbfed0da7bf"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.642485 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-87c49d4f8-74x7z" podStartSLOduration=18.642458893 podStartE2EDuration="18.642458893s" podCreationTimestamp="2026-01-21 09:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:27.626937063 +0000 UTC m=+906.377404380" watchObservedRunningTime="2026-01-21 09:18:27.642458893 +0000 UTC m=+906.392926211" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.661572 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58985598b5-rf45g" event={"ID":"dd14fbe1-01de-41f5-9247-d15844d8c697","Type":"ContainerStarted","Data":"8ff7660f688692f43f9a9b9a465fec5a6276cd2648ca24e0157fd25354c831fc"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.661892 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58985598b5-rf45g" event={"ID":"dd14fbe1-01de-41f5-9247-d15844d8c697","Type":"ContainerStarted","Data":"9833eed5fbe543076b1a12e2445fd11952abb956a7e5fce5578904f13f7f3b1a"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.661908 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58985598b5-rf45g" event={"ID":"dd14fbe1-01de-41f5-9247-d15844d8c697","Type":"ContainerStarted","Data":"fc47e682de44a7de06e8319b9601da26665d8f1cdaf54bc61522ec0f43de0f0f"} Jan 21 09:18:27 crc 
kubenswrapper[4618]: I0121 09:18:27.662967 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.677493 4618 generic.go:334] "Generic (PLEG): container finished" podID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerID="5cc4bf939c84d0a22200fdfb0482fd73589dcd02a395b7df41af44905590b5cb" exitCode=0 Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.677558 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdwbk" event={"ID":"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8","Type":"ContainerDied","Data":"5cc4bf939c84d0a22200fdfb0482fd73589dcd02a395b7df41af44905590b5cb"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.677582 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdwbk" event={"ID":"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8","Type":"ContainerStarted","Data":"aa5dbcaca625a69bacdaaa6bd89b152053cff05e906cd3676aa7b40e12f5e61a"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.682106 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7784c76494-zjhpz" podStartSLOduration=18.682092781 podStartE2EDuration="18.682092781s" podCreationTimestamp="2026-01-21 09:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:27.670640082 +0000 UTC m=+906.421107389" watchObservedRunningTime="2026-01-21 09:18:27.682092781 +0000 UTC m=+906.432560098" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.708236 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58985598b5-rf45g" podStartSLOduration=7.708223052 podStartE2EDuration="7.708223052s" podCreationTimestamp="2026-01-21 09:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:27.691871712 +0000 UTC m=+906.442339028" watchObservedRunningTime="2026-01-21 09:18:27.708223052 +0000 UTC m=+906.458690369" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.709342 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69f66b98c6-zmvmx" event={"ID":"7b0e8004-6943-476d-9fab-36846adbc5de","Type":"ContainerStarted","Data":"617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.709375 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69f66b98c6-zmvmx" event={"ID":"7b0e8004-6943-476d-9fab-36846adbc5de","Type":"ContainerStarted","Data":"984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.709385 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69f66b98c6-zmvmx" event={"ID":"7b0e8004-6943-476d-9fab-36846adbc5de","Type":"ContainerStarted","Data":"f8ea2c75a8552bd2d6bd9dbc3fb31d0e4c4d110c23118aaa6b014ebca533d7c0"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.709502 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.734771 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l5lv6" event={"ID":"8d13246f-0095-4316-9769-2173765b9ae6","Type":"ContainerStarted","Data":"b4903311de2e0d5517b3770d98e9bfaad26bd69150b6b3dd9d488ddf1d2a52df"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.734824 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l5lv6" event={"ID":"8d13246f-0095-4316-9769-2173765b9ae6","Type":"ContainerStarted","Data":"faeb3834b824abeffe0dcec0a0781526b95dc68fdb5ffd95d7a1864a68d4f89f"} Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.736626 4618 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69f66b98c6-zmvmx" podStartSLOduration=9.736610469 podStartE2EDuration="9.736610469s" podCreationTimestamp="2026-01-21 09:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:27.735087114 +0000 UTC m=+906.485554431" watchObservedRunningTime="2026-01-21 09:18:27.736610469 +0000 UTC m=+906.487077787" Jan 21 09:18:27 crc kubenswrapper[4618]: I0121 09:18:27.776373 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:18:28 crc kubenswrapper[4618]: W0121 09:18:28.265998 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf21bdd0_078c_45ea_9027_7c9c70f53513.slice/crio-da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453 WatchSource:0}: Error finding container da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453: Status 404 returned error can't find the container with id da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453 Jan 21 09:18:28 crc kubenswrapper[4618]: I0121 09:18:28.751310 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df21bdd0-078c-45ea-9027-7c9c70f53513","Type":"ContainerStarted","Data":"da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453"} Jan 21 09:18:28 crc kubenswrapper[4618]: I0121 09:18:28.775964 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-log" containerID="cri-o://971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01" gracePeriod=30 Jan 21 09:18:28 crc kubenswrapper[4618]: I0121 09:18:28.776342 4618 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-internal-api-0" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-httpd" containerID="cri-o://b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca" gracePeriod=30 Jan 21 09:18:28 crc kubenswrapper[4618]: I0121 09:18:28.776427 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d","Type":"ContainerStarted","Data":"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca"} Jan 21 09:18:28 crc kubenswrapper[4618]: I0121 09:18:28.827316 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-l5lv6" podStartSLOduration=14.827292489 podStartE2EDuration="14.827292489s" podCreationTimestamp="2026-01-21 09:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:27.781536368 +0000 UTC m=+906.532003686" watchObservedRunningTime="2026-01-21 09:18:28.827292489 +0000 UTC m=+907.577759806" Jan 21 09:18:28 crc kubenswrapper[4618]: I0121 09:18:28.827646 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=23.827642007 podStartE2EDuration="23.827642007s" podCreationTimestamp="2026-01-21 09:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:28.820128777 +0000 UTC m=+907.570596095" watchObservedRunningTime="2026-01-21 09:18:28.827642007 +0000 UTC m=+907.578109323" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.450241 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.450645 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.508502 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.508581 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.613286 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.668794 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-combined-ca-bundle\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.668983 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-httpd-run\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.669022 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-scripts\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.669044 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpcs5\" (UniqueName: \"kubernetes.io/projected/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-kube-api-access-lpcs5\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") 
" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.669100 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-logs\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.669227 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.669247 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.669271 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-internal-tls-certs\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.678593 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-logs" (OuterVolumeSpecName: "logs") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.678842 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.680524 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-kube-api-access-lpcs5" (OuterVolumeSpecName: "kube-api-access-lpcs5") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "kube-api-access-lpcs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.726225 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-scripts" (OuterVolumeSpecName: "scripts") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.731433 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.763553 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.773425 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data" (OuterVolumeSpecName: "config-data") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774038 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data\") pod \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\" (UID: \"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d\") " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774581 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774603 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774613 4618 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774623 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774632 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpcs5\" (UniqueName: \"kubernetes.io/projected/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-kube-api-access-lpcs5\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774643 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: W0121 09:18:29.774885 4618 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d/volumes/kubernetes.io~secret/config-data Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.774898 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data" (OuterVolumeSpecName: "config-data") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.792338 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.815550 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" (UID: "d21dfc61-08a8-4bcc-8a33-c3bd7a96612d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.819517 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df21bdd0-078c-45ea-9027-7c9c70f53513","Type":"ContainerStarted","Data":"1c071816e027744ad70de7f9ccd11e6896cb6217d495383eef6f66db52694e42"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.824539 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2tdw" event={"ID":"12353eaa-fb43-415a-b590-f69fadbdd4e1","Type":"ContainerStarted","Data":"31e4d95e7e1ccdff0b91bc94a4b7eb6d8ccb2a874e1197bcf9ba5a422bfbe3b1"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.847993 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-q2tdw" podStartSLOduration=3.127903968 podStartE2EDuration="29.847976078s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="2026-01-21 09:18:01.970302017 +0000 UTC m=+880.720769333" lastFinishedPulling="2026-01-21 09:18:28.690374125 +0000 UTC m=+907.440841443" observedRunningTime="2026-01-21 09:18:29.846436733 +0000 UTC m=+908.596904050" watchObservedRunningTime="2026-01-21 09:18:29.847976078 +0000 UTC m=+908.598443395" Jan 21 09:18:29 crc 
kubenswrapper[4618]: I0121 09:18:29.850107 4618 generic.go:334] "Generic (PLEG): container finished" podID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerID="b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca" exitCode=0 Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.850161 4618 generic.go:334] "Generic (PLEG): container finished" podID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerID="971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01" exitCode=143 Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.850215 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d","Type":"ContainerDied","Data":"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.850252 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d","Type":"ContainerDied","Data":"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.850262 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d21dfc61-08a8-4bcc-8a33-c3bd7a96612d","Type":"ContainerDied","Data":"261530b56cb4d98e7bb683600db0ee899b70032c7097223f85d497c0ae02046d"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.850279 4618 scope.go:117] "RemoveContainer" containerID="b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.850426 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.854993 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-gxfjk" event={"ID":"3f873699-80a9-4e80-93fc-7c99e5cd5e69","Type":"ContainerStarted","Data":"479cdd8d27d04a95f05ec290ac3c82fa6e012e0fa248631ecb0cf311e78bc141"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.855312 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.857292 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerStarted","Data":"aaa108acefc96795f5d138b3c0cc4cc548eae8a0d621bf454f3de5cc2b41820d"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.863422 4618 generic.go:334] "Generic (PLEG): container finished" podID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerID="aaf3a6579f555c617b640a91a299b26611326023d8ce349fe689f0dfe10c0299" exitCode=0 Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.864532 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdwbk" event={"ID":"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8","Type":"ContainerDied","Data":"aaf3a6579f555c617b640a91a299b26611326023d8ce349fe689f0dfe10c0299"} Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.875691 4618 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.875711 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 
09:18:29.875724 4618 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.878687 4618 scope.go:117] "RemoveContainer" containerID="971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.888408 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-685444497c-gxfjk" podStartSLOduration=11.888398758 podStartE2EDuration="11.888398758s" podCreationTimestamp="2026-01-21 09:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:29.880843369 +0000 UTC m=+908.631310686" watchObservedRunningTime="2026-01-21 09:18:29.888398758 +0000 UTC m=+908.638866065" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.909918 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.920750 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.926713 4618 scope.go:117] "RemoveContainer" containerID="b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.926808 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:29 crc kubenswrapper[4618]: E0121 09:18:29.927246 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927264 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" Jan 21 09:18:29 crc kubenswrapper[4618]: E0121 09:18:29.927298 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="init" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927304 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="init" Jan 21 09:18:29 crc kubenswrapper[4618]: E0121 09:18:29.927319 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-log" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927326 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-log" Jan 21 09:18:29 crc kubenswrapper[4618]: E0121 09:18:29.927337 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-httpd" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927344 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-httpd" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927532 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-log" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927561 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" containerName="glance-httpd" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.927586 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.928518 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.933348 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.933522 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 21 09:18:29 crc kubenswrapper[4618]: E0121 09:18:29.934466 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca\": container with ID starting with b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca not found: ID does not exist" containerID="b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.934509 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca"} err="failed to get container status \"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca\": rpc error: code = NotFound desc = could not find container \"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca\": container with ID starting with b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca not found: ID does not exist" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.934535 4618 scope.go:117] "RemoveContainer" containerID="971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01" Jan 21 09:18:29 crc kubenswrapper[4618]: E0121 09:18:29.934918 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01\": container with ID starting with 
971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01 not found: ID does not exist" containerID="971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.934940 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01"} err="failed to get container status \"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01\": rpc error: code = NotFound desc = could not find container \"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01\": container with ID starting with 971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01 not found: ID does not exist" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.934990 4618 scope.go:117] "RemoveContainer" containerID="b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.935216 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca"} err="failed to get container status \"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca\": rpc error: code = NotFound desc = could not find container \"b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca\": container with ID starting with b3e6284894768fb443247073526613a2910c991436ec62e7e9eca3729ddb38ca not found: ID does not exist" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.935236 4618 scope.go:117] "RemoveContainer" containerID="971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.935404 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01"} err="failed to get container status 
\"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01\": rpc error: code = NotFound desc = could not find container \"971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01\": container with ID starting with 971af8e92afe76204441343daff76a54d035bad569862b9b478de8f1a58cea01 not found: ID does not exist" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.953859 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.979489 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.979555 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.979587 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.979815 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.979888 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wlp9\" (UniqueName: \"kubernetes.io/projected/7a68fe88-de0c-468d-998a-77d5d5a29d83-kube-api-access-4wlp9\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.980084 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.980187 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:29 crc kubenswrapper[4618]: I0121 09:18:29.980215 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.082248 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.082335 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.082363 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.082682 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.082938 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.083242 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.083270 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.083389 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.083441 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wlp9\" (UniqueName: \"kubernetes.io/projected/7a68fe88-de0c-468d-998a-77d5d5a29d83-kube-api-access-4wlp9\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.083561 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.084321 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc 
kubenswrapper[4618]: I0121 09:18:30.087830 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.089164 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.090628 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.091668 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.098850 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wlp9\" (UniqueName: \"kubernetes.io/projected/7a68fe88-de0c-468d-998a-77d5d5a29d83-kube-api-access-4wlp9\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.108785 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.245481 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.556529 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-mvdfm" podUID="97a18a3f-1479-4331-88b7-ca75b69d1187" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.809820 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.906898 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a68fe88-de0c-468d-998a-77d5d5a29d83","Type":"ContainerStarted","Data":"31c5723af720f3fceb3e1a8ddcbef8a950e7f6f788a7959ebf25a85b567766b5"} Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.914528 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df21bdd0-078c-45ea-9027-7c9c70f53513","Type":"ContainerStarted","Data":"53b10d67a60fc5d18ff2f8559caaa2cba98240b0cb3fc37afa5b461f8b8c9ebf"} Jan 21 09:18:30 crc kubenswrapper[4618]: I0121 09:18:30.941955 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.941883253 podStartE2EDuration="16.941883253s" podCreationTimestamp="2026-01-21 09:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
09:18:30.929780362 +0000 UTC m=+909.680247679" watchObservedRunningTime="2026-01-21 09:18:30.941883253 +0000 UTC m=+909.692350570" Jan 21 09:18:31 crc kubenswrapper[4618]: I0121 09:18:31.548395 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d21dfc61-08a8-4bcc-8a33-c3bd7a96612d" path="/var/lib/kubelet/pods/d21dfc61-08a8-4bcc-8a33-c3bd7a96612d/volumes" Jan 21 09:18:31 crc kubenswrapper[4618]: I0121 09:18:31.798327 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:31 crc kubenswrapper[4618]: I0121 09:18:31.926026 4618 generic.go:334] "Generic (PLEG): container finished" podID="8d13246f-0095-4316-9769-2173765b9ae6" containerID="b4903311de2e0d5517b3770d98e9bfaad26bd69150b6b3dd9d488ddf1d2a52df" exitCode=0 Jan 21 09:18:31 crc kubenswrapper[4618]: I0121 09:18:31.926082 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l5lv6" event={"ID":"8d13246f-0095-4316-9769-2173765b9ae6","Type":"ContainerDied","Data":"b4903311de2e0d5517b3770d98e9bfaad26bd69150b6b3dd9d488ddf1d2a52df"} Jan 21 09:18:31 crc kubenswrapper[4618]: I0121 09:18:31.928846 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a68fe88-de0c-468d-998a-77d5d5a29d83","Type":"ContainerStarted","Data":"930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303"} Jan 21 09:18:32 crc kubenswrapper[4618]: I0121 09:18:32.950987 4618 generic.go:334] "Generic (PLEG): container finished" podID="12353eaa-fb43-415a-b590-f69fadbdd4e1" containerID="31e4d95e7e1ccdff0b91bc94a4b7eb6d8ccb2a874e1197bcf9ba5a422bfbe3b1" exitCode=0 Jan 21 09:18:32 crc kubenswrapper[4618]: I0121 09:18:32.951242 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2tdw" event={"ID":"12353eaa-fb43-415a-b590-f69fadbdd4e1","Type":"ContainerDied","Data":"31e4d95e7e1ccdff0b91bc94a4b7eb6d8ccb2a874e1197bcf9ba5a422bfbe3b1"} Jan 
21 09:18:33 crc kubenswrapper[4618]: I0121 09:18:33.566352 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:18:33 crc kubenswrapper[4618]: I0121 09:18:33.621606 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-47fmh"] Jan 21 09:18:33 crc kubenswrapper[4618]: I0121 09:18:33.622018 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="dnsmasq-dns" containerID="cri-o://1cfea8e5527ff192e098e92bd19599ab93b794be05d85421784cf1dcdaf59376" gracePeriod=10 Jan 21 09:18:33 crc kubenswrapper[4618]: I0121 09:18:33.969450 4618 generic.go:334] "Generic (PLEG): container finished" podID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerID="1cfea8e5527ff192e098e92bd19599ab93b794be05d85421784cf1dcdaf59376" exitCode=0 Jan 21 09:18:33 crc kubenswrapper[4618]: I0121 09:18:33.969852 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" event={"ID":"528c0fac-d760-4ab1-8e8f-3edf42c51f40","Type":"ContainerDied","Data":"1cfea8e5527ff192e098e92bd19599ab93b794be05d85421784cf1dcdaf59376"} Jan 21 09:18:34 crc kubenswrapper[4618]: I0121 09:18:34.817441 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 09:18:34 crc kubenswrapper[4618]: I0121 09:18:34.817506 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 09:18:34 crc kubenswrapper[4618]: I0121 09:18:34.868109 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 09:18:34 crc kubenswrapper[4618]: I0121 09:18:34.876711 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Jan 21 09:18:34 crc kubenswrapper[4618]: I0121 09:18:34.979722 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 09:18:34 crc kubenswrapper[4618]: I0121 09:18:34.980383 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 09:18:36 crc kubenswrapper[4618]: I0121 09:18:36.158412 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Jan 21 09:18:36 crc kubenswrapper[4618]: I0121 09:18:36.714627 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.002353 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2tdw" event={"ID":"12353eaa-fb43-415a-b590-f69fadbdd4e1","Type":"ContainerDied","Data":"3ef9577bc38421c4b291dded0ea127ca5a4f40bdc5c0cf28ea527101511b7cc6"} Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.002676 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ef9577bc38421c4b291dded0ea127ca5a4f40bdc5c0cf28ea527101511b7cc6" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.004706 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l5lv6" event={"ID":"8d13246f-0095-4316-9769-2173765b9ae6","Type":"ContainerDied","Data":"faeb3834b824abeffe0dcec0a0781526b95dc68fdb5ffd95d7a1864a68d4f89f"} Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.004922 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faeb3834b824abeffe0dcec0a0781526b95dc68fdb5ffd95d7a1864a68d4f89f" Jan 21 09:18:37 crc 
kubenswrapper[4618]: I0121 09:18:37.055976 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.085609 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.135830 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-fernet-keys\") pod \"8d13246f-0095-4316-9769-2173765b9ae6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.135942 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-scripts\") pod \"8d13246f-0095-4316-9769-2173765b9ae6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.136032 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-config-data\") pod \"8d13246f-0095-4316-9769-2173765b9ae6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.136178 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4wk\" (UniqueName: \"kubernetes.io/projected/8d13246f-0095-4316-9769-2173765b9ae6-kube-api-access-rg4wk\") pod \"8d13246f-0095-4316-9769-2173765b9ae6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.136272 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-credential-keys\") pod \"8d13246f-0095-4316-9769-2173765b9ae6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.136298 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-combined-ca-bundle\") pod \"8d13246f-0095-4316-9769-2173765b9ae6\" (UID: \"8d13246f-0095-4316-9769-2173765b9ae6\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.146268 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-scripts" (OuterVolumeSpecName: "scripts") pod "8d13246f-0095-4316-9769-2173765b9ae6" (UID: "8d13246f-0095-4316-9769-2173765b9ae6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.150571 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8d13246f-0095-4316-9769-2173765b9ae6" (UID: "8d13246f-0095-4316-9769-2173765b9ae6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.155233 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8d13246f-0095-4316-9769-2173765b9ae6" (UID: "8d13246f-0095-4316-9769-2173765b9ae6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.155507 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d13246f-0095-4316-9769-2173765b9ae6-kube-api-access-rg4wk" (OuterVolumeSpecName: "kube-api-access-rg4wk") pod "8d13246f-0095-4316-9769-2173765b9ae6" (UID: "8d13246f-0095-4316-9769-2173765b9ae6"). InnerVolumeSpecName "kube-api-access-rg4wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.191123 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.241532 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-scripts\") pod \"12353eaa-fb43-415a-b590-f69fadbdd4e1\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.241619 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-combined-ca-bundle\") pod \"12353eaa-fb43-415a-b590-f69fadbdd4e1\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.241698 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-config-data\") pod \"12353eaa-fb43-415a-b590-f69fadbdd4e1\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.241763 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd9l2\" (UniqueName: 
\"kubernetes.io/projected/12353eaa-fb43-415a-b590-f69fadbdd4e1-kube-api-access-cd9l2\") pod \"12353eaa-fb43-415a-b590-f69fadbdd4e1\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.241796 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12353eaa-fb43-415a-b590-f69fadbdd4e1-logs\") pod \"12353eaa-fb43-415a-b590-f69fadbdd4e1\" (UID: \"12353eaa-fb43-415a-b590-f69fadbdd4e1\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.243962 4618 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.243996 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.244001 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d13246f-0095-4316-9769-2173765b9ae6" (UID: "8d13246f-0095-4316-9769-2173765b9ae6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.244009 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4wk\" (UniqueName: \"kubernetes.io/projected/8d13246f-0095-4316-9769-2173765b9ae6-kube-api-access-rg4wk\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.244275 4618 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.245621 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12353eaa-fb43-415a-b590-f69fadbdd4e1-logs" (OuterVolumeSpecName: "logs") pod "12353eaa-fb43-415a-b590-f69fadbdd4e1" (UID: "12353eaa-fb43-415a-b590-f69fadbdd4e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.258397 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-scripts" (OuterVolumeSpecName: "scripts") pod "12353eaa-fb43-415a-b590-f69fadbdd4e1" (UID: "12353eaa-fb43-415a-b590-f69fadbdd4e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.260287 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12353eaa-fb43-415a-b590-f69fadbdd4e1-kube-api-access-cd9l2" (OuterVolumeSpecName: "kube-api-access-cd9l2") pod "12353eaa-fb43-415a-b590-f69fadbdd4e1" (UID: "12353eaa-fb43-415a-b590-f69fadbdd4e1"). InnerVolumeSpecName "kube-api-access-cd9l2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.260298 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-config-data" (OuterVolumeSpecName: "config-data") pod "8d13246f-0095-4316-9769-2173765b9ae6" (UID: "8d13246f-0095-4316-9769-2173765b9ae6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.273666 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12353eaa-fb43-415a-b590-f69fadbdd4e1" (UID: "12353eaa-fb43-415a-b590-f69fadbdd4e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.288318 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-config-data" (OuterVolumeSpecName: "config-data") pod "12353eaa-fb43-415a-b590-f69fadbdd4e1" (UID: "12353eaa-fb43-415a-b590-f69fadbdd4e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.344917 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-svc\") pod \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.345253 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-sb\") pod \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.345334 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-swift-storage-0\") pod \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.345433 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-nb\") pod \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.345507 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-config\") pod \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.345527 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kkd6\" (UniqueName: 
\"kubernetes.io/projected/528c0fac-d760-4ab1-8e8f-3edf42c51f40-kube-api-access-8kkd6\") pod \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\" (UID: \"528c0fac-d760-4ab1-8e8f-3edf42c51f40\") " Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346056 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346073 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346084 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d13246f-0095-4316-9769-2173765b9ae6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346093 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346102 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12353eaa-fb43-415a-b590-f69fadbdd4e1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346110 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd9l2\" (UniqueName: \"kubernetes.io/projected/12353eaa-fb43-415a-b590-f69fadbdd4e1-kube-api-access-cd9l2\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.346119 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12353eaa-fb43-415a-b590-f69fadbdd4e1-logs\") on 
node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.356277 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528c0fac-d760-4ab1-8e8f-3edf42c51f40-kube-api-access-8kkd6" (OuterVolumeSpecName: "kube-api-access-8kkd6") pod "528c0fac-d760-4ab1-8e8f-3edf42c51f40" (UID: "528c0fac-d760-4ab1-8e8f-3edf42c51f40"). InnerVolumeSpecName "kube-api-access-8kkd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.386712 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "528c0fac-d760-4ab1-8e8f-3edf42c51f40" (UID: "528c0fac-d760-4ab1-8e8f-3edf42c51f40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.398535 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "528c0fac-d760-4ab1-8e8f-3edf42c51f40" (UID: "528c0fac-d760-4ab1-8e8f-3edf42c51f40"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.409563 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "528c0fac-d760-4ab1-8e8f-3edf42c51f40" (UID: "528c0fac-d760-4ab1-8e8f-3edf42c51f40"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.418057 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-config" (OuterVolumeSpecName: "config") pod "528c0fac-d760-4ab1-8e8f-3edf42c51f40" (UID: "528c0fac-d760-4ab1-8e8f-3edf42c51f40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.421869 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "528c0fac-d760-4ab1-8e8f-3edf42c51f40" (UID: "528c0fac-d760-4ab1-8e8f-3edf42c51f40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.447437 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.447462 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.447473 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kkd6\" (UniqueName: \"kubernetes.io/projected/528c0fac-d760-4ab1-8e8f-3edf42c51f40-kube-api-access-8kkd6\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.447489 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc 
kubenswrapper[4618]: I0121 09:18:37.447497 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.447504 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/528c0fac-d760-4ab1-8e8f-3edf42c51f40-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:37 crc kubenswrapper[4618]: I0121 09:18:37.644363 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.019797 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" event={"ID":"528c0fac-d760-4ab1-8e8f-3edf42c51f40","Type":"ContainerDied","Data":"096082e3e75397c88b83c7b74ea0d61f6c87c32d78525e4f464515c6e27806ae"} Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.019852 4618 scope.go:117] "RemoveContainer" containerID="1cfea8e5527ff192e098e92bd19599ab93b794be05d85421784cf1dcdaf59376" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.019981 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-47fmh" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.031454 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerStarted","Data":"7330be96257bbb4c23b634d4224a714b1897828f0bc81f8ca5caea9017ce0fab"} Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.044238 4618 scope.go:117] "RemoveContainer" containerID="27bff08c38c1a8d2f4236396f25924479e28878fb87b359584fce30c74834ea8" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.047575 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a68fe88-de0c-468d-998a-77d5d5a29d83","Type":"ContainerStarted","Data":"ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87"} Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.049997 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-47fmh"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.052033 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdwbk" event={"ID":"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8","Type":"ContainerStarted","Data":"55c53d9ae857fa8bfcc058674e7b2eb54bb50d4f6d172488443ab4da2fbbb02d"} Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.059247 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q2tdw" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.062178 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgbvc" event={"ID":"1f724147-bec3-4df8-8f5d-cb9ff9e128e0","Type":"ContainerStarted","Data":"87a40136d868c2824f5fc5bb46b743424c7a3dbf9f6bc49e200f2795c790d5dc"} Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.062975 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l5lv6" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.088675 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.088782 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.103329 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-47fmh"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.107009 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.106997335 podStartE2EDuration="9.106997335s" podCreationTimestamp="2026-01-21 09:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:38.09477018 +0000 UTC m=+916.845237497" watchObservedRunningTime="2026-01-21 09:18:38.106997335 +0000 UTC m=+916.857464652" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.146646 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sdwbk" podStartSLOduration=17.080664381 podStartE2EDuration="21.146631292s" podCreationTimestamp="2026-01-21 09:18:17 +0000 UTC" firstStartedPulling="2026-01-21 09:18:27.681448649 +0000 UTC m=+906.431915966" lastFinishedPulling="2026-01-21 09:18:31.747415559 +0000 UTC m=+910.497882877" observedRunningTime="2026-01-21 09:18:38.1296519 +0000 UTC m=+916.880119216" watchObservedRunningTime="2026-01-21 09:18:38.146631292 +0000 UTC m=+916.897098609" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.212131 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4sxs"] Jan 21 09:18:38 crc 
kubenswrapper[4618]: E0121 09:18:38.212816 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="dnsmasq-dns" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.212828 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="dnsmasq-dns" Jan 21 09:18:38 crc kubenswrapper[4618]: E0121 09:18:38.212845 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d13246f-0095-4316-9769-2173765b9ae6" containerName="keystone-bootstrap" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.212852 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d13246f-0095-4316-9769-2173765b9ae6" containerName="keystone-bootstrap" Jan 21 09:18:38 crc kubenswrapper[4618]: E0121 09:18:38.212863 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12353eaa-fb43-415a-b590-f69fadbdd4e1" containerName="placement-db-sync" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.212868 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="12353eaa-fb43-415a-b590-f69fadbdd4e1" containerName="placement-db-sync" Jan 21 09:18:38 crc kubenswrapper[4618]: E0121 09:18:38.212880 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="init" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.212886 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="init" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.213046 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="12353eaa-fb43-415a-b590-f69fadbdd4e1" containerName="placement-db-sync" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.213066 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d13246f-0095-4316-9769-2173765b9ae6" containerName="keystone-bootstrap" Jan 21 09:18:38 crc kubenswrapper[4618]: 
I0121 09:18:38.213073 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" containerName="dnsmasq-dns" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.214209 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.288232 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4sxs"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.293907 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wgbvc" podStartSLOduration=2.537189659 podStartE2EDuration="38.293884663s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="2026-01-21 09:18:01.58182494 +0000 UTC m=+880.332292258" lastFinishedPulling="2026-01-21 09:18:37.338519946 +0000 UTC m=+916.088987262" observedRunningTime="2026-01-21 09:18:38.160985737 +0000 UTC m=+916.911453054" watchObservedRunningTime="2026-01-21 09:18:38.293884663 +0000 UTC m=+917.044351980" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.295440 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-utilities\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.295672 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4279m\" (UniqueName: \"kubernetes.io/projected/b69aa60f-666b-4605-b06b-debf9ef07d48-kube-api-access-4279m\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc 
kubenswrapper[4618]: I0121 09:18:38.295707 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-catalog-content\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.336105 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76759dfdcd-gxbvm"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.337359 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.343607 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54d488db9b-swfld"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.343721 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.343835 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.343908 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.344117 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.344475 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l8whb" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.344669 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.344683 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.346797 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.351289 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.351443 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.351572 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.351690 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-j9mls" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.356460 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76759dfdcd-gxbvm"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.359667 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54d488db9b-swfld"] Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398175 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-config-data\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398222 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4279m\" (UniqueName: 
\"kubernetes.io/projected/b69aa60f-666b-4605-b06b-debf9ef07d48-kube-api-access-4279m\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398241 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-scripts\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398267 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-combined-ca-bundle\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398287 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-catalog-content\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398315 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-combined-ca-bundle\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398330 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-config-data\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398367 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-fernet-keys\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398402 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-internal-tls-certs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398432 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-public-tls-certs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398450 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-credential-keys\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398481 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4634a027-fe25-4458-9f23-b984afd7a60f-logs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398521 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-public-tls-certs\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398543 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz29d\" (UniqueName: \"kubernetes.io/projected/4634a027-fe25-4458-9f23-b984afd7a60f-kube-api-access-fz29d\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398571 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-scripts\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398601 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-utilities\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398623 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sbfp2\" (UniqueName: \"kubernetes.io/projected/b62e287a-7db9-4d83-aae5-9cc273fff127-kube-api-access-sbfp2\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.398654 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-internal-tls-certs\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.399318 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-catalog-content\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.399450 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-utilities\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.441056 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4279m\" (UniqueName: \"kubernetes.io/projected/b69aa60f-666b-4605-b06b-debf9ef07d48-kube-api-access-4279m\") pod \"certified-operators-p4sxs\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500353 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-public-tls-certs\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500391 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz29d\" (UniqueName: \"kubernetes.io/projected/4634a027-fe25-4458-9f23-b984afd7a60f-kube-api-access-fz29d\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500421 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-scripts\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500478 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfp2\" (UniqueName: \"kubernetes.io/projected/b62e287a-7db9-4d83-aae5-9cc273fff127-kube-api-access-sbfp2\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500519 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-internal-tls-certs\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500544 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-config-data\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500562 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-scripts\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500582 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-combined-ca-bundle\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500603 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-combined-ca-bundle\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500616 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-config-data\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500642 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-fernet-keys\") pod 
\"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500674 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-internal-tls-certs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500700 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-public-tls-certs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500744 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-credential-keys\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.500772 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4634a027-fe25-4458-9f23-b984afd7a60f-logs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.501227 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4634a027-fe25-4458-9f23-b984afd7a60f-logs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " 
pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.508247 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-combined-ca-bundle\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.509463 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-scripts\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.509664 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-scripts\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.515388 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-public-tls-certs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.515840 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-public-tls-certs\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.515945 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-internal-tls-certs\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.516557 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-credential-keys\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.517627 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-internal-tls-certs\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.517951 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-config-data\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.518229 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-combined-ca-bundle\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.518790 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/b62e287a-7db9-4d83-aae5-9cc273fff127-fernet-keys\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.520051 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4634a027-fe25-4458-9f23-b984afd7a60f-config-data\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.520551 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz29d\" (UniqueName: \"kubernetes.io/projected/4634a027-fe25-4458-9f23-b984afd7a60f-kube-api-access-fz29d\") pod \"placement-54d488db9b-swfld\" (UID: \"4634a027-fe25-4458-9f23-b984afd7a60f\") " pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.530647 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfp2\" (UniqueName: \"kubernetes.io/projected/b62e287a-7db9-4d83-aae5-9cc273fff127-kube-api-access-sbfp2\") pod \"keystone-76759dfdcd-gxbvm\" (UID: \"b62e287a-7db9-4d83-aae5-9cc273fff127\") " pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.540829 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.668891 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:38 crc kubenswrapper[4618]: I0121 09:18:38.677850 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.034679 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4sxs"] Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.082813 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4sxs" event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerStarted","Data":"073a7bad8e185d95f24ac0cad0c1b61f1b06f9a2bb8e40702b0b1359ef1f995d"} Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.200562 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76759dfdcd-gxbvm"] Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.223281 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sdwbk" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="registry-server" probeResult="failure" output=< Jan 21 09:18:39 crc kubenswrapper[4618]: timeout: failed to connect service ":50051" within 1s Jan 21 09:18:39 crc kubenswrapper[4618]: > Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.405074 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54d488db9b-swfld"] Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.452752 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7784c76494-zjhpz" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.511006 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-87c49d4f8-74x7z" podUID="d4a5a9b2-1432-43cc-bfe1-58285caf06ea" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Jan 21 09:18:39 crc kubenswrapper[4618]: I0121 09:18:39.593615 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528c0fac-d760-4ab1-8e8f-3edf42c51f40" path="/var/lib/kubelet/pods/528c0fac-d760-4ab1-8e8f-3edf42c51f40/volumes" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.106031 4618 generic.go:334] "Generic (PLEG): container finished" podID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" containerID="87a40136d868c2824f5fc5bb46b743424c7a3dbf9f6bc49e200f2795c790d5dc" exitCode=0 Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.106209 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgbvc" event={"ID":"1f724147-bec3-4df8-8f5d-cb9ff9e128e0","Type":"ContainerDied","Data":"87a40136d868c2824f5fc5bb46b743424c7a3dbf9f6bc49e200f2795c790d5dc"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.109524 4618 generic.go:334] "Generic (PLEG): container finished" podID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerID="4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804" exitCode=0 Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.109571 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4sxs" event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerDied","Data":"4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.112730 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76759dfdcd-gxbvm" event={"ID":"b62e287a-7db9-4d83-aae5-9cc273fff127","Type":"ContainerStarted","Data":"34d6aff3f2b313a817f71d668acf83e8015305a366848f93e53c039a5a411beb"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.112800 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76759dfdcd-gxbvm" 
event={"ID":"b62e287a-7db9-4d83-aae5-9cc273fff127","Type":"ContainerStarted","Data":"59a2d4ee65452c909203004000f007512988d578d6ebfefddea1b29e4251b86e"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.113292 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.119093 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54d488db9b-swfld" event={"ID":"4634a027-fe25-4458-9f23-b984afd7a60f","Type":"ContainerStarted","Data":"d92a52520a30735f49e909f7742e456801a4f4a92650da39816a26a396cf49a8"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.119135 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.119172 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54d488db9b-swfld" event={"ID":"4634a027-fe25-4458-9f23-b984afd7a60f","Type":"ContainerStarted","Data":"b7913c5efb5aa8f59b15bd5d122840ef6f02c28c7c0e115eb0f2d609d9b77eba"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.119182 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54d488db9b-swfld" event={"ID":"4634a027-fe25-4458-9f23-b984afd7a60f","Type":"ContainerStarted","Data":"6abb1821543946541019fcb7b0b86943d4551d12c782b1e589b426ee87fd6cb8"} Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.119204 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54d488db9b-swfld" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.163392 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54d488db9b-swfld" podStartSLOduration=2.16337451 podStartE2EDuration="2.16337451s" podCreationTimestamp="2026-01-21 09:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:40.154977729 +0000 UTC m=+918.905445045" watchObservedRunningTime="2026-01-21 09:18:40.16337451 +0000 UTC m=+918.913841827" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.176350 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76759dfdcd-gxbvm" podStartSLOduration=2.176329554 podStartE2EDuration="2.176329554s" podCreationTimestamp="2026-01-21 09:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:40.17206286 +0000 UTC m=+918.922530176" watchObservedRunningTime="2026-01-21 09:18:40.176329554 +0000 UTC m=+918.926796871" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.246492 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.246582 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.290037 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:40 crc kubenswrapper[4618]: I0121 09:18:40.305390 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.134278 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4sxs" event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerStarted","Data":"ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc"} Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.135270 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:41 
crc kubenswrapper[4618]: I0121 09:18:41.135307 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.576197 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.786516 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-db-sync-config-data\") pod \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.787100 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8r9c\" (UniqueName: \"kubernetes.io/projected/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-kube-api-access-z8r9c\") pod \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.787173 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-combined-ca-bundle\") pod \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\" (UID: \"1f724147-bec3-4df8-8f5d-cb9ff9e128e0\") " Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.792628 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1f724147-bec3-4df8-8f5d-cb9ff9e128e0" (UID: "1f724147-bec3-4df8-8f5d-cb9ff9e128e0"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.797915 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-kube-api-access-z8r9c" (OuterVolumeSpecName: "kube-api-access-z8r9c") pod "1f724147-bec3-4df8-8f5d-cb9ff9e128e0" (UID: "1f724147-bec3-4df8-8f5d-cb9ff9e128e0"). InnerVolumeSpecName "kube-api-access-z8r9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.813917 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f724147-bec3-4df8-8f5d-cb9ff9e128e0" (UID: "1f724147-bec3-4df8-8f5d-cb9ff9e128e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.889364 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8r9c\" (UniqueName: \"kubernetes.io/projected/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-kube-api-access-z8r9c\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.889397 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:41 crc kubenswrapper[4618]: I0121 09:18:41.889407 4618 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f724147-bec3-4df8-8f5d-cb9ff9e128e0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.143265 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wgbvc" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.143287 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wgbvc" event={"ID":"1f724147-bec3-4df8-8f5d-cb9ff9e128e0","Type":"ContainerDied","Data":"2545458992bdcdf39c642793c8085c5a06933294a717e0e941e3b87445383567"} Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.143384 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2545458992bdcdf39c642793c8085c5a06933294a717e0e941e3b87445383567" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.145336 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs7mz" event={"ID":"46a5bcfd-6e6c-4070-b7e2-b2e90789f888","Type":"ContainerStarted","Data":"a398c0974e3acf51dc424a7d06423325756541e8a3ba87270615b3e91608837f"} Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.155050 4618 generic.go:334] "Generic (PLEG): container finished" podID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerID="ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc" exitCode=0 Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.156681 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4sxs" event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerDied","Data":"ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc"} Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.172691 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vs7mz" podStartSLOduration=2.519823529 podStartE2EDuration="42.172678789s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="2026-01-21 09:18:01.477408501 +0000 UTC m=+880.227875818" lastFinishedPulling="2026-01-21 09:18:41.130263761 +0000 UTC m=+919.880731078" observedRunningTime="2026-01-21 09:18:42.16278935 +0000 UTC 
m=+920.913256667" watchObservedRunningTime="2026-01-21 09:18:42.172678789 +0000 UTC m=+920.923146106" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.849282 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5df6b49cb5-9npwx"] Jan 21 09:18:42 crc kubenswrapper[4618]: E0121 09:18:42.849815 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" containerName="barbican-db-sync" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.849830 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" containerName="barbican-db-sync" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.850062 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" containerName="barbican-db-sync" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.851137 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.857603 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.857623 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2fbfh" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.857761 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.860222 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c95ff478-nbsz8"] Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.862180 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.865762 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5df6b49cb5-9npwx"] Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.867043 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.868401 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c95ff478-nbsz8"] Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.970425 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-rlgjs"] Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.972347 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:42 crc kubenswrapper[4618]: I0121 09:18:42.987552 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-rlgjs"] Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.018677 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-config-data\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.018739 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qxb9\" (UniqueName: \"kubernetes.io/projected/4c1702d5-7295-4662-956a-180ac3b7c04d-kube-api-access-2qxb9\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc 
kubenswrapper[4618]: I0121 09:18:43.018862 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-logs\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.018912 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-combined-ca-bundle\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.018940 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-config-data-custom\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.018961 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-config-data-custom\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.018987 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1702d5-7295-4662-956a-180ac3b7c04d-logs\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: 
\"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.019045 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-combined-ca-bundle\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.019117 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57b8h\" (UniqueName: \"kubernetes.io/projected/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-kube-api-access-57b8h\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.019180 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-config-data\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.057437 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d46c4985b-6wf5b"] Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.059634 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.065632 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.074243 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d46c4985b-6wf5b"] Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.123561 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-nb\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.123622 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2d9f\" (UniqueName: \"kubernetes.io/projected/7d0dfb5b-cbab-4646-8b56-cac2978270f8-kube-api-access-x2d9f\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.123690 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-config-data\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.123717 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qxb9\" (UniqueName: \"kubernetes.io/projected/4c1702d5-7295-4662-956a-180ac3b7c04d-kube-api-access-2qxb9\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " 
pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124161 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-logs\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124202 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-svc\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124224 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-combined-ca-bundle\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124242 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-config-data-custom\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124268 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-config-data-custom\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: 
\"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124295 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1702d5-7295-4662-956a-180ac3b7c04d-logs\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124333 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-swift-storage-0\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124349 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-config\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124385 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-combined-ca-bundle\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124433 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-sb\") pod 
\"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57b8h\" (UniqueName: \"kubernetes.io/projected/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-kube-api-access-57b8h\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.124522 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-config-data\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.127793 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-logs\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.127816 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1702d5-7295-4662-956a-180ac3b7c04d-logs\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.132777 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-config-data\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: 
\"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.137284 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-config-data\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.139050 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-combined-ca-bundle\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.140580 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-combined-ca-bundle\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.141825 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c1702d5-7295-4662-956a-180ac3b7c04d-config-data-custom\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.142635 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-config-data-custom\") pod 
\"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.151555 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qxb9\" (UniqueName: \"kubernetes.io/projected/4c1702d5-7295-4662-956a-180ac3b7c04d-kube-api-access-2qxb9\") pod \"barbican-keystone-listener-7c95ff478-nbsz8\" (UID: \"4c1702d5-7295-4662-956a-180ac3b7c04d\") " pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.151768 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57b8h\" (UniqueName: \"kubernetes.io/projected/65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b-kube-api-access-57b8h\") pod \"barbican-worker-5df6b49cb5-9npwx\" (UID: \"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b\") " pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.163429 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.178966 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5df6b49cb5-9npwx" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.187848 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227234 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227285 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-sb\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227328 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data-custom\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227354 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-nb\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227370 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2d9f\" (UniqueName: \"kubernetes.io/projected/7d0dfb5b-cbab-4646-8b56-cac2978270f8-kube-api-access-x2d9f\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: 
\"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227387 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-combined-ca-bundle\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227409 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d5744a-4aa4-403e-9051-a4764b44304c-logs\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227453 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqltf\" (UniqueName: \"kubernetes.io/projected/01d5744a-4aa4-403e-9051-a4764b44304c-kube-api-access-sqltf\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227484 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-svc\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.227924 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-swift-storage-0\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" 
(UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.228172 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-sb\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.228205 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-svc\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.228186 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-config\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.228326 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-nb\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.228682 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-swift-storage-0\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 
09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.229669 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-config\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.243926 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2d9f\" (UniqueName: \"kubernetes.io/projected/7d0dfb5b-cbab-4646-8b56-cac2978270f8-kube-api-access-x2d9f\") pod \"dnsmasq-dns-66cdd4b5b5-rlgjs\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.300499 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.330190 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.330323 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data-custom\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.330376 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-combined-ca-bundle\") pod 
\"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.330426 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d5744a-4aa4-403e-9051-a4764b44304c-logs\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.330551 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqltf\" (UniqueName: \"kubernetes.io/projected/01d5744a-4aa4-403e-9051-a4764b44304c-kube-api-access-sqltf\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.335730 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d5744a-4aa4-403e-9051-a4764b44304c-logs\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.336708 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.337227 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data-custom\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " 
pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.353577 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-combined-ca-bundle\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.369985 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqltf\" (UniqueName: \"kubernetes.io/projected/01d5744a-4aa4-403e-9051-a4764b44304c-kube-api-access-sqltf\") pod \"barbican-api-6d46c4985b-6wf5b\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.382352 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:43 crc kubenswrapper[4618]: I0121 09:18:43.813292 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:44 crc kubenswrapper[4618]: I0121 09:18:44.171177 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:18:44 crc kubenswrapper[4618]: I0121 09:18:44.800108 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.184100 4618 generic.go:334] "Generic (PLEG): container finished" podID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" containerID="a398c0974e3acf51dc424a7d06423325756541e8a3ba87270615b3e91608837f" exitCode=0 Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.184182 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs7mz" 
event={"ID":"46a5bcfd-6e6c-4070-b7e2-b2e90789f888","Type":"ContainerDied","Data":"a398c0974e3acf51dc424a7d06423325756541e8a3ba87270615b3e91608837f"} Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.468526 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68d7cbc6d4-mthph"] Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.469840 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.473904 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.474102 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.485625 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68d7cbc6d4-mthph"] Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575259 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-internal-tls-certs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575358 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add10569-0b7d-47e6-a9fc-943ff2f54fc4-logs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575395 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-config-data-custom\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575425 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-public-tls-certs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575451 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-config-data\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575468 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlnng\" (UniqueName: \"kubernetes.io/projected/add10569-0b7d-47e6-a9fc-943ff2f54fc4-kube-api-access-rlnng\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.575534 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-combined-ca-bundle\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.677520 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-config-data-custom\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.678552 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-public-tls-certs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.678601 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-config-data\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.678626 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlnng\" (UniqueName: \"kubernetes.io/projected/add10569-0b7d-47e6-a9fc-943ff2f54fc4-kube-api-access-rlnng\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.678726 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-combined-ca-bundle\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.678941 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-internal-tls-certs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.679072 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add10569-0b7d-47e6-a9fc-943ff2f54fc4-logs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.680845 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add10569-0b7d-47e6-a9fc-943ff2f54fc4-logs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.683957 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-combined-ca-bundle\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.687989 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-config-data-custom\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.689507 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-internal-tls-certs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.689751 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-public-tls-certs\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.690240 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add10569-0b7d-47e6-a9fc-943ff2f54fc4-config-data\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.694049 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlnng\" (UniqueName: \"kubernetes.io/projected/add10569-0b7d-47e6-a9fc-943ff2f54fc4-kube-api-access-rlnng\") pod \"barbican-api-68d7cbc6d4-mthph\" (UID: \"add10569-0b7d-47e6-a9fc-943ff2f54fc4\") " pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:45 crc kubenswrapper[4618]: I0121 09:18:45.794916 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.811197 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.936003 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-scripts\") pod \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.936107 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-combined-ca-bundle\") pod \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.936318 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-config-data\") pod \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.936530 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-db-sync-config-data\") pod \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.936638 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-etc-machine-id\") pod \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.936681 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmc9d\" 
(UniqueName: \"kubernetes.io/projected/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-kube-api-access-zmc9d\") pod \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\" (UID: \"46a5bcfd-6e6c-4070-b7e2-b2e90789f888\") " Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.937069 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "46a5bcfd-6e6c-4070-b7e2-b2e90789f888" (UID: "46a5bcfd-6e6c-4070-b7e2-b2e90789f888"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.938835 4618 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.944650 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "46a5bcfd-6e6c-4070-b7e2-b2e90789f888" (UID: "46a5bcfd-6e6c-4070-b7e2-b2e90789f888"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.947189 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-kube-api-access-zmc9d" (OuterVolumeSpecName: "kube-api-access-zmc9d") pod "46a5bcfd-6e6c-4070-b7e2-b2e90789f888" (UID: "46a5bcfd-6e6c-4070-b7e2-b2e90789f888"). InnerVolumeSpecName "kube-api-access-zmc9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.954575 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-scripts" (OuterVolumeSpecName: "scripts") pod "46a5bcfd-6e6c-4070-b7e2-b2e90789f888" (UID: "46a5bcfd-6e6c-4070-b7e2-b2e90789f888"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.971518 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46a5bcfd-6e6c-4070-b7e2-b2e90789f888" (UID: "46a5bcfd-6e6c-4070-b7e2-b2e90789f888"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:47 crc kubenswrapper[4618]: I0121 09:18:47.990135 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-config-data" (OuterVolumeSpecName: "config-data") pod "46a5bcfd-6e6c-4070-b7e2-b2e90789f888" (UID: "46a5bcfd-6e6c-4070-b7e2-b2e90789f888"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.041284 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.041317 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.041331 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.041342 4618 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.041354 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmc9d\" (UniqueName: \"kubernetes.io/projected/46a5bcfd-6e6c-4070-b7e2-b2e90789f888-kube-api-access-zmc9d\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.229022 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vs7mz" event={"ID":"46a5bcfd-6e6c-4070-b7e2-b2e90789f888","Type":"ContainerDied","Data":"9b29f209a01559e4bc231bf58e6f8c1bf96701266aa69313731d4e58c3739e5b"} Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.229071 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b29f209a01559e4bc231bf58e6f8c1bf96701266aa69313731d4e58c3739e5b" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.229095 4618 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vs7mz" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.666114 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.806512 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-rlgjs"] Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.874048 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d46c4985b-6wf5b"] Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.936046 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c95ff478-nbsz8"] Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.952702 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5df6b49cb5-9npwx"] Jan 21 09:18:48 crc kubenswrapper[4618]: W0121 09:18:48.966385 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd10569_0b7d_47e6_a9fc_943ff2f54fc4.slice/crio-42e0d9b70ad5bbbed3f82ee98c8bc26274673ced0facbf0b9212b6f8f7e2190c WatchSource:0}: Error finding container 42e0d9b70ad5bbbed3f82ee98c8bc26274673ced0facbf0b9212b6f8f7e2190c: Status 404 returned error can't find the container with id 42e0d9b70ad5bbbed3f82ee98c8bc26274673ced0facbf0b9212b6f8f7e2190c Jan 21 09:18:48 crc kubenswrapper[4618]: I0121 09:18:48.970193 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68d7cbc6d4-mthph"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.052298 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:18:49 crc kubenswrapper[4618]: E0121 09:18:49.052882 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" 
containerName="cinder-db-sync" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.052972 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" containerName="cinder-db-sync" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.053260 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" containerName="cinder-db-sync" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.054245 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.060017 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.060228 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vdpf6" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.060356 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.060521 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.063281 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.072766 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-rlgjs"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.140112 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-rjwsk"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.141675 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.145335 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sdwbk" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="registry-server" probeResult="failure" output=< Jan 21 09:18:49 crc kubenswrapper[4618]: timeout: failed to connect service ":50051" within 1s Jan 21 09:18:49 crc kubenswrapper[4618]: > Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.158353 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-rjwsk"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.170677 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.172221 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.172278 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-scripts\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.172359 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2cc678ed-dd0c-4b8b-b857-51e5c128f870-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.172417 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvfdf\" (UniqueName: \"kubernetes.io/projected/2cc678ed-dd0c-4b8b-b857-51e5c128f870-kube-api-access-rvfdf\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.172480 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.238445 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.239922 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.244121 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.252891 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274056 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjwx\" (UniqueName: \"kubernetes.io/projected/3fc2ee65-7f27-4673-af65-1452cb65b3a5-kube-api-access-7cjwx\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274117 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274164 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274199 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc 
kubenswrapper[4618]: I0121 09:18:49.274236 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274256 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274300 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274316 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fc2ee65-7f27-4673-af65-1452cb65b3a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274330 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fc2ee65-7f27-4673-af65-1452cb65b3a5-logs\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274346 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274378 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-scripts\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274421 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cc678ed-dd0c-4b8b-b857-51e5c128f870-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274446 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-config\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274491 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvfdf\" (UniqueName: \"kubernetes.io/projected/2cc678ed-dd0c-4b8b-b857-51e5c128f870-kube-api-access-rvfdf\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274530 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274548 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-scripts\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274586 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274619 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.274647 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kqq\" (UniqueName: \"kubernetes.io/projected/32a4f5ed-f364-473e-b70e-736ff25ad7cd-kube-api-access-b7kqq\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.275161 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cc678ed-dd0c-4b8b-b857-51e5c128f870-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " 
pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.279290 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-scripts\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.279538 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.281400 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.283002 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.291461 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68d7cbc6d4-mthph" event={"ID":"add10569-0b7d-47e6-a9fc-943ff2f54fc4","Type":"ContainerStarted","Data":"42e0d9b70ad5bbbed3f82ee98c8bc26274673ced0facbf0b9212b6f8f7e2190c"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.297525 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" 
event={"ID":"4c1702d5-7295-4662-956a-180ac3b7c04d","Type":"ContainerStarted","Data":"b4b56107233d10497cdb802095e3c5bd0b1a36b95b4ebcbfff4776a2f9929ed8"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.297912 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvfdf\" (UniqueName: \"kubernetes.io/projected/2cc678ed-dd0c-4b8b-b857-51e5c128f870-kube-api-access-rvfdf\") pod \"cinder-scheduler-0\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.302901 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" event={"ID":"7d0dfb5b-cbab-4646-8b56-cac2978270f8","Type":"ContainerStarted","Data":"2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.302934 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" event={"ID":"7d0dfb5b-cbab-4646-8b56-cac2978270f8","Type":"ContainerStarted","Data":"12d1bc84601fa97b154c2f7585508880b328c3e7ead27d4df91346f7e0063f6f"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.317436 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d46c4985b-6wf5b" event={"ID":"01d5744a-4aa4-403e-9051-a4764b44304c","Type":"ContainerStarted","Data":"150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.317477 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d46c4985b-6wf5b" event={"ID":"01d5744a-4aa4-403e-9051-a4764b44304c","Type":"ContainerStarted","Data":"51371dacd2de0af42c1623072961a550b90ae8e346c18b5da0b598511fdba2cc"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.325267 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4sxs" 
event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerStarted","Data":"204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.328022 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerStarted","Data":"d604199e5d78b2f344d39ad917fe5d55b59be49607e3aa15cadef8cb502e85c7"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.328172 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-central-agent" containerID="cri-o://ff15d193ebe18d38e17a22e7ad3bef679e52c9d89e412a90ade1da5b3d0e1a62" gracePeriod=30 Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.328382 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.328431 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="proxy-httpd" containerID="cri-o://d604199e5d78b2f344d39ad917fe5d55b59be49607e3aa15cadef8cb502e85c7" gracePeriod=30 Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.328474 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="sg-core" containerID="cri-o://7330be96257bbb4c23b634d4224a714b1897828f0bc81f8ca5caea9017ce0fab" gracePeriod=30 Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.328535 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-notification-agent" containerID="cri-o://aaa108acefc96795f5d138b3c0cc4cc548eae8a0d621bf454f3de5cc2b41820d" 
gracePeriod=30 Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.334426 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5df6b49cb5-9npwx" event={"ID":"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b","Type":"ContainerStarted","Data":"c368ee379b537c9bb152ea9e2aba1842f03ba3380b93844ec8a7bb0dd40ed0f4"} Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.355821 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4sxs" podStartSLOduration=3.070504495 podStartE2EDuration="11.355804072s" podCreationTimestamp="2026-01-21 09:18:38 +0000 UTC" firstStartedPulling="2026-01-21 09:18:40.110870008 +0000 UTC m=+918.861337325" lastFinishedPulling="2026-01-21 09:18:48.396169585 +0000 UTC m=+927.146636902" observedRunningTime="2026-01-21 09:18:49.347913022 +0000 UTC m=+928.098380339" watchObservedRunningTime="2026-01-21 09:18:49.355804072 +0000 UTC m=+928.106271389" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384153 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kqq\" (UniqueName: \"kubernetes.io/projected/32a4f5ed-f364-473e-b70e-736ff25ad7cd-kube-api-access-b7kqq\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384197 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjwx\" (UniqueName: \"kubernetes.io/projected/3fc2ee65-7f27-4673-af65-1452cb65b3a5-kube-api-access-7cjwx\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384238 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384259 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384282 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384307 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384335 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fc2ee65-7f27-4673-af65-1452cb65b3a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384349 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fc2ee65-7f27-4673-af65-1452cb65b3a5-logs\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " 
pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384365 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-config\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384440 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-scripts\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384463 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.384486 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.385124 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.385216 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.385243 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fc2ee65-7f27-4673-af65-1452cb65b3a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.385658 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fc2ee65-7f27-4673-af65-1452cb65b3a5-logs\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.388689 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.390588 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.391087 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.395823 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.397785 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-config\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.397789 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.398074 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-scripts\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.398452 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " 
pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.405692 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjwx\" (UniqueName: \"kubernetes.io/projected/3fc2ee65-7f27-4673-af65-1452cb65b3a5-kube-api-access-7cjwx\") pod \"cinder-api-0\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " pod="openstack/cinder-api-0" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.424717 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kqq\" (UniqueName: \"kubernetes.io/projected/32a4f5ed-f364-473e-b70e-736ff25ad7cd-kube-api-access-b7kqq\") pod \"dnsmasq-dns-75dbb546bf-rjwsk\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.483589 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:49 crc kubenswrapper[4618]: I0121 09:18:49.651867 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.051064 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.626074259 podStartE2EDuration="50.051035934s" podCreationTimestamp="2026-01-21 09:18:00 +0000 UTC" firstStartedPulling="2026-01-21 09:18:01.970378961 +0000 UTC m=+880.720846278" lastFinishedPulling="2026-01-21 09:18:48.395340636 +0000 UTC m=+927.145807953" observedRunningTime="2026-01-21 09:18:49.398232354 +0000 UTC m=+928.148699671" watchObservedRunningTime="2026-01-21 09:18:50.051035934 +0000 UTC m=+928.801503250" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.052101 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.061364 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.185915 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-rjwsk"] Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.222345 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-svc\") pod \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.222607 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2d9f\" (UniqueName: \"kubernetes.io/projected/7d0dfb5b-cbab-4646-8b56-cac2978270f8-kube-api-access-x2d9f\") pod \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.222675 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-swift-storage-0\") pod \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.222706 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-nb\") pod \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.222759 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-sb\") pod \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.222781 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-config\") pod \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\" (UID: \"7d0dfb5b-cbab-4646-8b56-cac2978270f8\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.237618 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0dfb5b-cbab-4646-8b56-cac2978270f8-kube-api-access-x2d9f" (OuterVolumeSpecName: "kube-api-access-x2d9f") pod "7d0dfb5b-cbab-4646-8b56-cac2978270f8" (UID: "7d0dfb5b-cbab-4646-8b56-cac2978270f8"). InnerVolumeSpecName "kube-api-access-x2d9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.261913 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-config" (OuterVolumeSpecName: "config") pod "7d0dfb5b-cbab-4646-8b56-cac2978270f8" (UID: "7d0dfb5b-cbab-4646-8b56-cac2978270f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.275460 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.279657 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d0dfb5b-cbab-4646-8b56-cac2978270f8" (UID: "7d0dfb5b-cbab-4646-8b56-cac2978270f8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.294698 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d0dfb5b-cbab-4646-8b56-cac2978270f8" (UID: "7d0dfb5b-cbab-4646-8b56-cac2978270f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.295396 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d0dfb5b-cbab-4646-8b56-cac2978270f8" (UID: "7d0dfb5b-cbab-4646-8b56-cac2978270f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.322520 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d0dfb5b-cbab-4646-8b56-cac2978270f8" (UID: "7d0dfb5b-cbab-4646-8b56-cac2978270f8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.324798 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.324880 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.324935 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.324983 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.325039 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0dfb5b-cbab-4646-8b56-cac2978270f8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.325086 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2d9f\" (UniqueName: \"kubernetes.io/projected/7d0dfb5b-cbab-4646-8b56-cac2978270f8-kube-api-access-x2d9f\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.360982 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68d7cbc6d4-mthph" event={"ID":"add10569-0b7d-47e6-a9fc-943ff2f54fc4","Type":"ContainerStarted","Data":"06738448e87f3e84a40b43d66cf04cb5899758d7e51382c88402767e10d3726e"} Jan 21 09:18:50 crc 
kubenswrapper[4618]: I0121 09:18:50.363702 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68d7cbc6d4-mthph" event={"ID":"add10569-0b7d-47e6-a9fc-943ff2f54fc4","Type":"ContainerStarted","Data":"52947050f77f4027d984ca7fc50856dadfd737b00b9b0606fad3c14023ef7774"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.363739 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.363751 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.365617 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cc678ed-dd0c-4b8b-b857-51e5c128f870","Type":"ContainerStarted","Data":"dfadf146e354a227a3f3c2b54749e1867c345bb2bed592d53106a4de3e77841e"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.368466 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.374967 4618 generic.go:334] "Generic (PLEG): container finished" podID="7d0dfb5b-cbab-4646-8b56-cac2978270f8" containerID="2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305" exitCode=0 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.375036 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" event={"ID":"7d0dfb5b-cbab-4646-8b56-cac2978270f8","Type":"ContainerDied","Data":"2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.375054 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cdd4b5b5-rlgjs" event={"ID":"7d0dfb5b-cbab-4646-8b56-cac2978270f8","Type":"ContainerDied","Data":"12d1bc84601fa97b154c2f7585508880b328c3e7ead27d4df91346f7e0063f6f"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.375069 4618 scope.go:117] "RemoveContainer" containerID="2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.386642 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68d7cbc6d4-mthph" podStartSLOduration=5.386623814 podStartE2EDuration="5.386623814s" podCreationTimestamp="2026-01-21 09:18:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:50.386084098 +0000 UTC m=+929.136551415" watchObservedRunningTime="2026-01-21 09:18:50.386623814 +0000 UTC m=+929.137091130" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.391547 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3fc2ee65-7f27-4673-af65-1452cb65b3a5","Type":"ContainerStarted","Data":"71447826a89a0ff394d7d31d5635767db2caf8bececa13d7331771a51ecb154f"} 
Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.404348 4618 scope.go:117] "RemoveContainer" containerID="2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.404460 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d46c4985b-6wf5b" event={"ID":"01d5744a-4aa4-403e-9051-a4764b44304c","Type":"ContainerStarted","Data":"266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.404550 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.404575 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:18:50 crc kubenswrapper[4618]: E0121 09:18:50.408616 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305\": container with ID starting with 2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305 not found: ID does not exist" containerID="2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.408649 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305"} err="failed to get container status \"2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305\": rpc error: code = NotFound desc = could not find container \"2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305\": container with ID starting with 2f6049b265cfebb480d675ba8089aa5e43b1e364670771e73dbbfdb38b758305 not found: ID does not exist" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.427528 4618 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" event={"ID":"32a4f5ed-f364-473e-b70e-736ff25ad7cd","Type":"ContainerStarted","Data":"cae5a4f64795b9dc8f99228b5c7ac9cd68115d368d216c1d6b8f05a819b06096"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.451939 4618 generic.go:334] "Generic (PLEG): container finished" podID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerID="d604199e5d78b2f344d39ad917fe5d55b59be49607e3aa15cadef8cb502e85c7" exitCode=0 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.451970 4618 generic.go:334] "Generic (PLEG): container finished" podID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerID="7330be96257bbb4c23b634d4224a714b1897828f0bc81f8ca5caea9017ce0fab" exitCode=2 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.451977 4618 generic.go:334] "Generic (PLEG): container finished" podID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerID="aaa108acefc96795f5d138b3c0cc4cc548eae8a0d621bf454f3de5cc2b41820d" exitCode=0 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.451984 4618 generic.go:334] "Generic (PLEG): container finished" podID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerID="ff15d193ebe18d38e17a22e7ad3bef679e52c9d89e412a90ade1da5b3d0e1a62" exitCode=0 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.452828 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-rlgjs"] Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.452856 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerDied","Data":"d604199e5d78b2f344d39ad917fe5d55b59be49607e3aa15cadef8cb502e85c7"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.452881 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerDied","Data":"7330be96257bbb4c23b634d4224a714b1897828f0bc81f8ca5caea9017ce0fab"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.452892 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerDied","Data":"aaa108acefc96795f5d138b3c0cc4cc548eae8a0d621bf454f3de5cc2b41820d"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.452902 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerDied","Data":"ff15d193ebe18d38e17a22e7ad3bef679e52c9d89e412a90ade1da5b3d0e1a62"} Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.454230 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.461036 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66cdd4b5b5-rlgjs"] Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.465670 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d46c4985b-6wf5b" podStartSLOduration=7.4656478889999995 podStartE2EDuration="7.465647889s" podCreationTimestamp="2026-01-21 09:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:50.437566928 +0000 UTC m=+929.188034245" watchObservedRunningTime="2026-01-21 09:18:50.465647889 +0000 UTC m=+929.216115196" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.629901 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-run-httpd\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " 
Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630004 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-scripts\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630073 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-combined-ca-bundle\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630112 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69gxl\" (UniqueName: \"kubernetes.io/projected/2eb33687-2287-42d0-aa55-a0aadf31dcca-kube-api-access-69gxl\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630159 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-sg-core-conf-yaml\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630252 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630274 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-log-httpd\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630317 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-config-data\") pod \"2eb33687-2287-42d0-aa55-a0aadf31dcca\" (UID: \"2eb33687-2287-42d0-aa55-a0aadf31dcca\") " Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.630590 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.631045 4618 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.631062 4618 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eb33687-2287-42d0-aa55-a0aadf31dcca-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.638282 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-scripts" (OuterVolumeSpecName: "scripts") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.638291 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb33687-2287-42d0-aa55-a0aadf31dcca-kube-api-access-69gxl" (OuterVolumeSpecName: "kube-api-access-69gxl") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "kube-api-access-69gxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.659239 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.713680 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.732834 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.732878 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69gxl\" (UniqueName: \"kubernetes.io/projected/2eb33687-2287-42d0-aa55-a0aadf31dcca-kube-api-access-69gxl\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.732891 4618 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.732901 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.755072 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-config-data" (OuterVolumeSpecName: "config-data") pod "2eb33687-2287-42d0-aa55-a0aadf31dcca" (UID: "2eb33687-2287-42d0-aa55-a0aadf31dcca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.783703 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58985598b5-rf45g" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.836304 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69f66b98c6-zmvmx"] Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.836530 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69f66b98c6-zmvmx" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-api" containerID="cri-o://984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de" gracePeriod=30 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.836913 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69f66b98c6-zmvmx" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-httpd" containerID="cri-o://617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9" gracePeriod=30 Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.837814 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb33687-2287-42d0-aa55-a0aadf31dcca-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:50 crc kubenswrapper[4618]: I0121 09:18:50.924947 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:51 crc kubenswrapper[4618]: E0121 09:18:51.173644 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b0e8004_6943_476d_9fab_36846adbc5de.slice/crio-617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9.scope\": RecentStats: unable to find data in memory cache]" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.464878 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3fc2ee65-7f27-4673-af65-1452cb65b3a5","Type":"ContainerStarted","Data":"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965"} Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.466280 4618 generic.go:334] "Generic (PLEG): container finished" podID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerID="9ad43541e15c378bd8b57317352791a435129bdd2c01ab5aa240a790edf94567" exitCode=0 Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.466325 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" event={"ID":"32a4f5ed-f364-473e-b70e-736ff25ad7cd","Type":"ContainerDied","Data":"9ad43541e15c378bd8b57317352791a435129bdd2c01ab5aa240a790edf94567"} Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.479410 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eb33687-2287-42d0-aa55-a0aadf31dcca","Type":"ContainerDied","Data":"f4ad97bdfcd096624bc1d7458788c98fd64faaf813b5cef598b4075afbcf1f35"} Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.479444 4618 scope.go:117] "RemoveContainer" containerID="d604199e5d78b2f344d39ad917fe5d55b59be49607e3aa15cadef8cb502e85c7" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.479555 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.489022 4618 generic.go:334] "Generic (PLEG): container finished" podID="7b0e8004-6943-476d-9fab-36846adbc5de" containerID="617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9" exitCode=0 Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.489099 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69f66b98c6-zmvmx" event={"ID":"7b0e8004-6943-476d-9fab-36846adbc5de","Type":"ContainerDied","Data":"617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9"} Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.572019 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0dfb5b-cbab-4646-8b56-cac2978270f8" path="/var/lib/kubelet/pods/7d0dfb5b-cbab-4646-8b56-cac2978270f8/volumes" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.572991 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.604488 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.626533 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:51 crc kubenswrapper[4618]: E0121 09:18:51.626948 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-notification-agent" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.626965 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-notification-agent" Jan 21 09:18:51 crc kubenswrapper[4618]: E0121 09:18:51.626978 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0dfb5b-cbab-4646-8b56-cac2978270f8" containerName="init" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.626984 4618 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0dfb5b-cbab-4646-8b56-cac2978270f8" containerName="init" Jan 21 09:18:51 crc kubenswrapper[4618]: E0121 09:18:51.626991 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="sg-core" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.626996 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="sg-core" Jan 21 09:18:51 crc kubenswrapper[4618]: E0121 09:18:51.627010 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="proxy-httpd" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627015 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="proxy-httpd" Jan 21 09:18:51 crc kubenswrapper[4618]: E0121 09:18:51.627044 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-central-agent" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627055 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-central-agent" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627284 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0dfb5b-cbab-4646-8b56-cac2978270f8" containerName="init" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627297 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-central-agent" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627307 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="proxy-httpd" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627314 4618 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="sg-core" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.627327 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" containerName="ceilometer-notification-agent" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.628957 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.630911 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.631176 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.651448 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.726417 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.729998 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.762816 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-scripts\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.762866 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.762898 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-config-data\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.762962 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-log-httpd\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.762988 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8g8q\" (UniqueName: \"kubernetes.io/projected/701c5495-43ad-4008-b8e2-57b5b8a18d56-kube-api-access-w8g8q\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.763005 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-run-httpd\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.763024 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 
09:18:51.866019 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-scripts\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.866065 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.866118 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-config-data\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.866279 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-log-httpd\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.866327 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8g8q\" (UniqueName: \"kubernetes.io/projected/701c5495-43ad-4008-b8e2-57b5b8a18d56-kube-api-access-w8g8q\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.866362 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-run-httpd\") pod \"ceilometer-0\" (UID: 
\"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.866402 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.871258 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-run-httpd\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.872536 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-log-httpd\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.874963 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-scripts\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.891521 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.891765 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.898010 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-config-data\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.902126 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8g8q\" (UniqueName: \"kubernetes.io/projected/701c5495-43ad-4008-b8e2-57b5b8a18d56-kube-api-access-w8g8q\") pod \"ceilometer-0\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " pod="openstack/ceilometer-0" Jan 21 09:18:51 crc kubenswrapper[4618]: I0121 09:18:51.949279 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.066400 4618 scope.go:117] "RemoveContainer" containerID="7330be96257bbb4c23b634d4224a714b1897828f0bc81f8ca5caea9017ce0fab" Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.299586 4618 scope.go:117] "RemoveContainer" containerID="aaa108acefc96795f5d138b3c0cc4cc548eae8a0d621bf454f3de5cc2b41820d" Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.356951 4618 scope.go:117] "RemoveContainer" containerID="ff15d193ebe18d38e17a22e7ad3bef679e52c9d89e412a90ade1da5b3d0e1a62" Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.528134 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" event={"ID":"32a4f5ed-f364-473e-b70e-736ff25ad7cd","Type":"ContainerStarted","Data":"b4030d3306f46a34332c6f7410af9b79ae2faab4495a8e8cc91750cb850a5769"} Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.529405 4618 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.548942 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" podStartSLOduration=3.54892826 podStartE2EDuration="3.54892826s" podCreationTimestamp="2026-01-21 09:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:52.546558893 +0000 UTC m=+931.297026210" watchObservedRunningTime="2026-01-21 09:18:52.54892826 +0000 UTC m=+931.299395577" Jan 21 09:18:52 crc kubenswrapper[4618]: I0121 09:18:52.749615 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.584839 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5df6b49cb5-9npwx" podStartSLOduration=8.15147932 podStartE2EDuration="11.584822463s" podCreationTimestamp="2026-01-21 09:18:42 +0000 UTC" firstStartedPulling="2026-01-21 09:18:48.96307506 +0000 UTC m=+927.713542377" lastFinishedPulling="2026-01-21 09:18:52.396418203 +0000 UTC m=+931.146885520" observedRunningTime="2026-01-21 09:18:53.579812222 +0000 UTC m=+932.330279539" watchObservedRunningTime="2026-01-21 09:18:53.584822463 +0000 UTC m=+932.335289780" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.607796 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb33687-2287-42d0-aa55-a0aadf31dcca" path="/var/lib/kubelet/pods/2eb33687-2287-42d0-aa55-a0aadf31dcca/volumes" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.608953 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.609000 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-5df6b49cb5-9npwx" event={"ID":"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b","Type":"ContainerStarted","Data":"1911fe193a839391fa90453fc3869bf448cea8ce33e6e867cbcbe695f2b028e5"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.609021 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5df6b49cb5-9npwx" event={"ID":"65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b","Type":"ContainerStarted","Data":"38ebe308d7e748f27e75e03ef8ceefc0eb719b6616ae3de912dcbebbce05fbc9"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.609592 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cc678ed-dd0c-4b8b-b857-51e5c128f870","Type":"ContainerStarted","Data":"f4aa7fb21018b58c5b7bdea2e2db928579996999c8ff9bf75e776ee65541237c"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.621154 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" event={"ID":"4c1702d5-7295-4662-956a-180ac3b7c04d","Type":"ContainerStarted","Data":"61f57fd14c72136e9590b92e2bcf64c2d9866e0228e3025f1b32508db6c058ed"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.621276 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" event={"ID":"4c1702d5-7295-4662-956a-180ac3b7c04d","Type":"ContainerStarted","Data":"8dfdee8d736bbdd0c2d8712e882d1b12c7e5f910dd12fbed9a905bfa00872aa1"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.658953 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerStarted","Data":"35c9eb62674b9f64c9e3fa6345544f11dc4cb87fa001cf46ffb1c2594a99e950"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.677175 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3fc2ee65-7f27-4673-af65-1452cb65b3a5","Type":"ContainerStarted","Data":"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33"} Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.677726 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api-log" containerID="cri-o://ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965" gracePeriod=30 Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.677904 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.678287 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api" containerID="cri-o://51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33" gracePeriod=30 Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.700616 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c95ff478-nbsz8" podStartSLOduration=8.367373089 podStartE2EDuration="11.70057896s" podCreationTimestamp="2026-01-21 09:18:42 +0000 UTC" firstStartedPulling="2026-01-21 09:18:48.966380837 +0000 UTC m=+927.716848144" lastFinishedPulling="2026-01-21 09:18:52.299586698 +0000 UTC m=+931.050054015" observedRunningTime="2026-01-21 09:18:53.655855592 +0000 UTC m=+932.406322909" watchObservedRunningTime="2026-01-21 09:18:53.70057896 +0000 UTC m=+932.451046277" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.748023 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.748002887 podStartE2EDuration="4.748002887s" podCreationTimestamp="2026-01-21 09:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:53.705065267 +0000 UTC m=+932.455532585" watchObservedRunningTime="2026-01-21 09:18:53.748002887 +0000 UTC m=+932.498470194" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.768285 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tldbm"] Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.770068 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.790178 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tldbm"] Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.864962 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-87c49d4f8-74x7z" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.916778 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7784c76494-zjhpz"] Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.931478 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48mph\" (UniqueName: \"kubernetes.io/projected/eb33f26a-71d2-4907-a3c3-f57202c99176-kube-api-access-48mph\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.931642 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-catalog-content\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:53 crc kubenswrapper[4618]: I0121 09:18:53.931670 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-utilities\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.033805 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-catalog-content\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.033856 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-utilities\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.033972 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48mph\" (UniqueName: \"kubernetes.io/projected/eb33f26a-71d2-4907-a3c3-f57202c99176-kube-api-access-48mph\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.034609 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-catalog-content\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.034638 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-utilities\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.050812 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48mph\" (UniqueName: \"kubernetes.io/projected/eb33f26a-71d2-4907-a3c3-f57202c99176-kube-api-access-48mph\") pod \"community-operators-tldbm\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.098161 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.396306 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546160 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data-custom\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546263 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fc2ee65-7f27-4673-af65-1452cb65b3a5-etc-machine-id\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546299 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjwx\" (UniqueName: \"kubernetes.io/projected/3fc2ee65-7f27-4673-af65-1452cb65b3a5-kube-api-access-7cjwx\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546319 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-combined-ca-bundle\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546342 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-scripts\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546505 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3fc2ee65-7f27-4673-af65-1452cb65b3a5-logs\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.546574 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data\") pod \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\" (UID: \"3fc2ee65-7f27-4673-af65-1452cb65b3a5\") " Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.550154 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc2ee65-7f27-4673-af65-1452cb65b3a5-logs" (OuterVolumeSpecName: "logs") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.550623 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc2ee65-7f27-4673-af65-1452cb65b3a5-kube-api-access-7cjwx" (OuterVolumeSpecName: "kube-api-access-7cjwx") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "kube-api-access-7cjwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.550827 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fc2ee65-7f27-4673-af65-1452cb65b3a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.550912 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-scripts" (OuterVolumeSpecName: "scripts") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.552801 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.582829 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.603729 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data" (OuterVolumeSpecName: "config-data") pod "3fc2ee65-7f27-4673-af65-1452cb65b3a5" (UID: "3fc2ee65-7f27-4673-af65-1452cb65b3a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.648991 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fc2ee65-7f27-4673-af65-1452cb65b3a5-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.649728 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.649744 4618 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.649756 4618 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fc2ee65-7f27-4673-af65-1452cb65b3a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.649765 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjwx\" (UniqueName: \"kubernetes.io/projected/3fc2ee65-7f27-4673-af65-1452cb65b3a5-kube-api-access-7cjwx\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.649774 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.649781 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc2ee65-7f27-4673-af65-1452cb65b3a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.687247 4618 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cc678ed-dd0c-4b8b-b857-51e5c128f870","Type":"ContainerStarted","Data":"23a78fba194b02a5e7856f89511758e90d608bb6a0965f2bf6a1fce3f00763f6"} Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.689681 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerStarted","Data":"baa19d216905e3374a20be89dd594efda419f21f7c345ef9ec8687bfc85c6046"} Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.689723 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerStarted","Data":"0d0c716bc5e69ae3e0aa858cfc309ae774ec3fab70e4cf1e0841d7e8826bb0bd"} Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.691216 4618 generic.go:334] "Generic (PLEG): container finished" podID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerID="51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33" exitCode=0 Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.691239 4618 generic.go:334] "Generic (PLEG): container finished" podID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerID="ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965" exitCode=143 Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.691748 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.692317 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3fc2ee65-7f27-4673-af65-1452cb65b3a5","Type":"ContainerDied","Data":"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33"} Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.692411 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3fc2ee65-7f27-4673-af65-1452cb65b3a5","Type":"ContainerDied","Data":"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965"} Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.692425 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3fc2ee65-7f27-4673-af65-1452cb65b3a5","Type":"ContainerDied","Data":"71447826a89a0ff394d7d31d5635767db2caf8bececa13d7331771a51ecb154f"} Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.692448 4618 scope.go:117] "RemoveContainer" containerID="51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.692799 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7784c76494-zjhpz" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon-log" containerID="cri-o://9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7" gracePeriod=30 Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.692912 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7784c76494-zjhpz" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" containerID="cri-o://8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074" gracePeriod=30 Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.724335 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=3.404102135 podStartE2EDuration="5.724316452s" podCreationTimestamp="2026-01-21 09:18:49 +0000 UTC" firstStartedPulling="2026-01-21 09:18:50.101265687 +0000 UTC m=+928.851733004" lastFinishedPulling="2026-01-21 09:18:52.421480005 +0000 UTC m=+931.171947321" observedRunningTime="2026-01-21 09:18:54.723754966 +0000 UTC m=+933.474222283" watchObservedRunningTime="2026-01-21 09:18:54.724316452 +0000 UTC m=+933.474783768" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.739277 4618 scope.go:117] "RemoveContainer" containerID="ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.739492 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tldbm"] Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.744016 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.749588 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.756983 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:54 crc kubenswrapper[4618]: E0121 09:18:54.757369 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api-log" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.757389 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api-log" Jan 21 09:18:54 crc kubenswrapper[4618]: E0121 09:18:54.757414 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.757421 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" 
containerName="cinder-api" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.757597 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api-log" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.757621 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" containerName="cinder-api" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.758548 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.760983 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.761249 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.761321 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.792629 4618 scope.go:117] "RemoveContainer" containerID="51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33" Jan 21 09:18:54 crc kubenswrapper[4618]: E0121 09:18:54.792971 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33\": container with ID starting with 51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33 not found: ID does not exist" containerID="51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793006 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33"} err="failed to get 
container status \"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33\": rpc error: code = NotFound desc = could not find container \"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33\": container with ID starting with 51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33 not found: ID does not exist" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793030 4618 scope.go:117] "RemoveContainer" containerID="ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965" Jan 21 09:18:54 crc kubenswrapper[4618]: E0121 09:18:54.793297 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965\": container with ID starting with ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965 not found: ID does not exist" containerID="ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793327 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965"} err="failed to get container status \"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965\": rpc error: code = NotFound desc = could not find container \"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965\": container with ID starting with ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965 not found: ID does not exist" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793339 4618 scope.go:117] "RemoveContainer" containerID="51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793494 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33"} 
err="failed to get container status \"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33\": rpc error: code = NotFound desc = could not find container \"51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33\": container with ID starting with 51f85ef85b725b9d74b49ba4f899be5651e186661600ef885e682b84ad825e33 not found: ID does not exist" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793521 4618 scope.go:117] "RemoveContainer" containerID="ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.793678 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965"} err="failed to get container status \"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965\": rpc error: code = NotFound desc = could not find container \"ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965\": container with ID starting with ec56e0c280b8969f9578ad8aff12ce2b1af68f37e9c31bde2a834ff0005cc965 not found: ID does not exist" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.796324 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.853499 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c710262-7141-4edf-8f70-b5ee3d235970-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.853551 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854121 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-config-data-custom\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854276 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-scripts\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854303 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854433 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c710262-7141-4edf-8f70-b5ee3d235970-logs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854685 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854745 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-config-data\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.854809 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s49lv\" (UniqueName: \"kubernetes.io/projected/3c710262-7141-4edf-8f70-b5ee3d235970-kube-api-access-s49lv\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.956615 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c710262-7141-4edf-8f70-b5ee3d235970-logs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.956983 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957022 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-config-data\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957064 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s49lv\" (UniqueName: 
\"kubernetes.io/projected/3c710262-7141-4edf-8f70-b5ee3d235970-kube-api-access-s49lv\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957176 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c710262-7141-4edf-8f70-b5ee3d235970-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957196 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957271 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-config-data-custom\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957309 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-scripts\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.957324 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc 
kubenswrapper[4618]: I0121 09:18:54.958720 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c710262-7141-4edf-8f70-b5ee3d235970-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.959008 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c710262-7141-4edf-8f70-b5ee3d235970-logs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.968887 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.970063 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-config-data\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.973582 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-config-data-custom\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.974676 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.975658 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.979496 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c710262-7141-4edf-8f70-b5ee3d235970-scripts\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:54 crc kubenswrapper[4618]: I0121 09:18:54.984635 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s49lv\" (UniqueName: \"kubernetes.io/projected/3c710262-7141-4edf-8f70-b5ee3d235970-kube-api-access-s49lv\") pod \"cinder-api-0\" (UID: \"3c710262-7141-4edf-8f70-b5ee3d235970\") " pod="openstack/cinder-api-0" Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.074405 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.555354 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc2ee65-7f27-4673-af65-1452cb65b3a5" path="/var/lib/kubelet/pods/3fc2ee65-7f27-4673-af65-1452cb65b3a5/volumes" Jan 21 09:18:55 crc kubenswrapper[4618]: W0121 09:18:55.584220 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c710262_7141_4edf_8f70_b5ee3d235970.slice/crio-ba1a63da21a0fd6b98acfdcbbdc16ac0307fb049a948eab055e277ad80859a6f WatchSource:0}: Error finding container ba1a63da21a0fd6b98acfdcbbdc16ac0307fb049a948eab055e277ad80859a6f: Status 404 returned error can't find the container with id ba1a63da21a0fd6b98acfdcbbdc16ac0307fb049a948eab055e277ad80859a6f Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.585309 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.724907 4618 generic.go:334] "Generic (PLEG): container finished" podID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerID="f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c" exitCode=0 Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.725206 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerDied","Data":"f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c"} Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.725234 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerStarted","Data":"39d9984078e6869c34ec24cfc53701d675ee1a51d36f0c2b549422863760d6d3"} Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.733863 4618 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"3c710262-7141-4edf-8f70-b5ee3d235970","Type":"ContainerStarted","Data":"ba1a63da21a0fd6b98acfdcbbdc16ac0307fb049a948eab055e277ad80859a6f"} Jan 21 09:18:55 crc kubenswrapper[4618]: I0121 09:18:55.755275 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerStarted","Data":"a2825559adc6707e7f5bb3329894c71e58af4f56342f9e8f13341a4cfc919cc1"} Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.783430 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3c710262-7141-4edf-8f70-b5ee3d235970","Type":"ContainerStarted","Data":"341cc088ab9a43dc4607569d787d702b3deaac66f71677f320d9c43e27943294"} Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.787937 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerStarted","Data":"49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7"} Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.797469 4618 generic.go:334] "Generic (PLEG): container finished" podID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerID="254815b64f63b11b08029d4b7856eadb7d21d7ec68730bdabed22c3ab54370a8" exitCode=137 Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.797521 4618 generic.go:334] "Generic (PLEG): container finished" podID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerID="c3ee741a332d0d27e315f8c53be9b88fa52fc79bc463094f0be8a9a5b958987c" exitCode=137 Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.797565 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f546b547-ct97b" event={"ID":"048b9318-e305-40c3-86e9-9081b01ca1cb","Type":"ContainerDied","Data":"254815b64f63b11b08029d4b7856eadb7d21d7ec68730bdabed22c3ab54370a8"} Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 
09:18:56.797610 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f546b547-ct97b" event={"ID":"048b9318-e305-40c3-86e9-9081b01ca1cb","Type":"ContainerDied","Data":"c3ee741a332d0d27e315f8c53be9b88fa52fc79bc463094f0be8a9a5b958987c"} Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.809388 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerStarted","Data":"ac778dbfaff7210fda0908dbfdb4dfbf29e869c826ae63523e69c48be89afeb5"} Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.810554 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 09:18:56 crc kubenswrapper[4618]: I0121 09:18:56.853303 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.074787261 podStartE2EDuration="5.853277431s" podCreationTimestamp="2026-01-21 09:18:51 +0000 UTC" firstStartedPulling="2026-01-21 09:18:52.757557095 +0000 UTC m=+931.508024412" lastFinishedPulling="2026-01-21 09:18:56.536047266 +0000 UTC m=+935.286514582" observedRunningTime="2026-01-21 09:18:56.835874062 +0000 UTC m=+935.586341379" watchObservedRunningTime="2026-01-21 09:18:56.853277431 +0000 UTC m=+935.603744748" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.215203 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.315394 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/048b9318-e305-40c3-86e9-9081b01ca1cb-logs\") pod \"048b9318-e305-40c3-86e9-9081b01ca1cb\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.315716 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-config-data\") pod \"048b9318-e305-40c3-86e9-9081b01ca1cb\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.315746 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048b9318-e305-40c3-86e9-9081b01ca1cb-logs" (OuterVolumeSpecName: "logs") pod "048b9318-e305-40c3-86e9-9081b01ca1cb" (UID: "048b9318-e305-40c3-86e9-9081b01ca1cb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.315829 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-scripts\") pod \"048b9318-e305-40c3-86e9-9081b01ca1cb\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.315998 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/048b9318-e305-40c3-86e9-9081b01ca1cb-horizon-secret-key\") pod \"048b9318-e305-40c3-86e9-9081b01ca1cb\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.316050 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csv6b\" (UniqueName: \"kubernetes.io/projected/048b9318-e305-40c3-86e9-9081b01ca1cb-kube-api-access-csv6b\") pod \"048b9318-e305-40c3-86e9-9081b01ca1cb\" (UID: \"048b9318-e305-40c3-86e9-9081b01ca1cb\") " Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.316749 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/048b9318-e305-40c3-86e9-9081b01ca1cb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.321768 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048b9318-e305-40c3-86e9-9081b01ca1cb-kube-api-access-csv6b" (OuterVolumeSpecName: "kube-api-access-csv6b") pod "048b9318-e305-40c3-86e9-9081b01ca1cb" (UID: "048b9318-e305-40c3-86e9-9081b01ca1cb"). InnerVolumeSpecName "kube-api-access-csv6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.322646 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/048b9318-e305-40c3-86e9-9081b01ca1cb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "048b9318-e305-40c3-86e9-9081b01ca1cb" (UID: "048b9318-e305-40c3-86e9-9081b01ca1cb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.347953 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-config-data" (OuterVolumeSpecName: "config-data") pod "048b9318-e305-40c3-86e9-9081b01ca1cb" (UID: "048b9318-e305-40c3-86e9-9081b01ca1cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.357030 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-scripts" (OuterVolumeSpecName: "scripts") pod "048b9318-e305-40c3-86e9-9081b01ca1cb" (UID: "048b9318-e305-40c3-86e9-9081b01ca1cb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.417603 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.417628 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/048b9318-e305-40c3-86e9-9081b01ca1cb-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.417637 4618 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/048b9318-e305-40c3-86e9-9081b01ca1cb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.417650 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csv6b\" (UniqueName: \"kubernetes.io/projected/048b9318-e305-40c3-86e9-9081b01ca1cb-kube-api-access-csv6b\") on node \"crc\" DevicePath \"\"" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.602862 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.624987 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68d7cbc6d4-mthph" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.689047 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d46c4985b-6wf5b"] Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.690256 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" 
containerID="cri-o://150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6" gracePeriod=30 Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.690413 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" containerID="cri-o://266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6" gracePeriod=30 Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.699372 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.699624 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.833552 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3c710262-7141-4edf-8f70-b5ee3d235970","Type":"ContainerStarted","Data":"1f6b11bdbc6131fdf6dbb0d9551695dd332b4d42e93ea0458f373c986e84b13b"} Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.835661 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.846829 4618 generic.go:334] "Generic (PLEG): container finished" podID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerID="49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7" exitCode=0 Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.847000 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" 
event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerDied","Data":"49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7"} Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.863652 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f546b547-ct97b" event={"ID":"048b9318-e305-40c3-86e9-9081b01ca1cb","Type":"ContainerDied","Data":"270c42f144481e2add4117f12cf94d8d3161bad95d093f828794b1d1a23a54a8"} Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.863740 4618 scope.go:117] "RemoveContainer" containerID="254815b64f63b11b08029d4b7856eadb7d21d7ec68730bdabed22c3ab54370a8" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.863932 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9f546b547-ct97b" Jan 21 09:18:57 crc kubenswrapper[4618]: I0121 09:18:57.864537 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.864521507 podStartE2EDuration="3.864521507s" podCreationTimestamp="2026-01-21 09:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:18:57.856689298 +0000 UTC m=+936.607156615" watchObservedRunningTime="2026-01-21 09:18:57.864521507 +0000 UTC m=+936.614988824" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.018629 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9f546b547-ct97b"] Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.026389 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9f546b547-ct97b"] Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.162236 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.167685 4618 scope.go:117] "RemoveContainer" 
containerID="c3ee741a332d0d27e315f8c53be9b88fa52fc79bc463094f0be8a9a5b958987c" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.257579 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.542750 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.542819 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.599243 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.873041 4618 generic.go:334] "Generic (PLEG): container finished" podID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerID="8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074" exitCode=0 Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.873126 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7784c76494-zjhpz" event={"ID":"696c8b1d-e84a-45de-bb32-d2b5526bfabc","Type":"ContainerDied","Data":"8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074"} Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.874965 4618 generic.go:334] "Generic (PLEG): container finished" podID="01d5744a-4aa4-403e-9051-a4764b44304c" containerID="150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6" exitCode=143 Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.875029 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d46c4985b-6wf5b" event={"ID":"01d5744a-4aa4-403e-9051-a4764b44304c","Type":"ContainerDied","Data":"150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6"} Jan 21 09:18:58 crc 
kubenswrapper[4618]: I0121 09:18:58.877646 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerStarted","Data":"4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18"} Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.929668 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tldbm" podStartSLOduration=3.141787786 podStartE2EDuration="5.929650812s" podCreationTimestamp="2026-01-21 09:18:53 +0000 UTC" firstStartedPulling="2026-01-21 09:18:55.730468832 +0000 UTC m=+934.480936150" lastFinishedPulling="2026-01-21 09:18:58.518331859 +0000 UTC m=+937.268799176" observedRunningTime="2026-01-21 09:18:58.926298167 +0000 UTC m=+937.676765485" watchObservedRunningTime="2026-01-21 09:18:58.929650812 +0000 UTC m=+937.680118129" Jan 21 09:18:58 crc kubenswrapper[4618]: I0121 09:18:58.939966 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.386180 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.449612 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7784c76494-zjhpz" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.486260 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.541323 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-685444497c-gxfjk"] Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.541911 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-685444497c-gxfjk" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerName="dnsmasq-dns" containerID="cri-o://479cdd8d27d04a95f05ec290ac3c82fa6e012e0fa248631ecb0cf311e78bc141" gracePeriod=10 Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.565862 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" path="/var/lib/kubelet/pods/048b9318-e305-40c3-86e9-9081b01ca1cb/volumes" Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.572639 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.888929 4618 generic.go:334] "Generic (PLEG): container finished" podID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerID="479cdd8d27d04a95f05ec290ac3c82fa6e012e0fa248631ecb0cf311e78bc141" exitCode=0 Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.889802 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-gxfjk" event={"ID":"3f873699-80a9-4e80-93fc-7c99e5cd5e69","Type":"ContainerDied","Data":"479cdd8d27d04a95f05ec290ac3c82fa6e012e0fa248631ecb0cf311e78bc141"} Jan 21 09:18:59 crc kubenswrapper[4618]: I0121 09:18:59.935313 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.003548 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.085355 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-config\") pod \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.085473 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-swift-storage-0\") pod \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.085559 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx2mj\" (UniqueName: \"kubernetes.io/projected/3f873699-80a9-4e80-93fc-7c99e5cd5e69-kube-api-access-hx2mj\") pod \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.085673 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-svc\") pod \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.085825 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-nb\") pod \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.085881 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-sb\") pod \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\" (UID: \"3f873699-80a9-4e80-93fc-7c99e5cd5e69\") " Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.115515 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f873699-80a9-4e80-93fc-7c99e5cd5e69-kube-api-access-hx2mj" (OuterVolumeSpecName: "kube-api-access-hx2mj") pod "3f873699-80a9-4e80-93fc-7c99e5cd5e69" (UID: "3f873699-80a9-4e80-93fc-7c99e5cd5e69"). InnerVolumeSpecName "kube-api-access-hx2mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.150594 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f873699-80a9-4e80-93fc-7c99e5cd5e69" (UID: "3f873699-80a9-4e80-93fc-7c99e5cd5e69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.154539 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3f873699-80a9-4e80-93fc-7c99e5cd5e69" (UID: "3f873699-80a9-4e80-93fc-7c99e5cd5e69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.163689 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-config" (OuterVolumeSpecName: "config") pod "3f873699-80a9-4e80-93fc-7c99e5cd5e69" (UID: "3f873699-80a9-4e80-93fc-7c99e5cd5e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.168496 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f873699-80a9-4e80-93fc-7c99e5cd5e69" (UID: "3f873699-80a9-4e80-93fc-7c99e5cd5e69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.179751 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f873699-80a9-4e80-93fc-7c99e5cd5e69" (UID: "3f873699-80a9-4e80-93fc-7c99e5cd5e69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.188936 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.189023 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.189088 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx2mj\" (UniqueName: \"kubernetes.io/projected/3f873699-80a9-4e80-93fc-7c99e5cd5e69-kube-api-access-hx2mj\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.189158 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 
09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.189212 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.189268 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f873699-80a9-4e80-93fc-7c99e5cd5e69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.501857 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sdwbk"] Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.502276 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sdwbk" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="registry-server" containerID="cri-o://55c53d9ae857fa8bfcc058674e7b2eb54bb50d4f6d172488443ab4da2fbbb02d" gracePeriod=2 Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.900902 4618 generic.go:334] "Generic (PLEG): container finished" podID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerID="55c53d9ae857fa8bfcc058674e7b2eb54bb50d4f6d172488443ab4da2fbbb02d" exitCode=0 Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.900967 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdwbk" event={"ID":"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8","Type":"ContainerDied","Data":"55c53d9ae857fa8bfcc058674e7b2eb54bb50d4f6d172488443ab4da2fbbb02d"} Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.901292 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdwbk" event={"ID":"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8","Type":"ContainerDied","Data":"aa5dbcaca625a69bacdaaa6bd89b152053cff05e906cd3676aa7b40e12f5e61a"} Jan 21 09:19:00 crc kubenswrapper[4618]: 
I0121 09:19:00.901311 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa5dbcaca625a69bacdaaa6bd89b152053cff05e906cd3676aa7b40e12f5e61a" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.903613 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="cinder-scheduler" containerID="cri-o://f4aa7fb21018b58c5b7bdea2e2db928579996999c8ff9bf75e776ee65541237c" gracePeriod=30 Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.904010 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-gxfjk" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.905908 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-gxfjk" event={"ID":"3f873699-80a9-4e80-93fc-7c99e5cd5e69","Type":"ContainerDied","Data":"bfc57341a3870e1c06026b9b03a9a323dd388bb5403bb4f4af7ebfbfed0da7bf"} Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.905983 4618 scope.go:117] "RemoveContainer" containerID="479cdd8d27d04a95f05ec290ac3c82fa6e012e0fa248631ecb0cf311e78bc141" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.905958 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="probe" containerID="cri-o://23a78fba194b02a5e7856f89511758e90d608bb6a0965f2bf6a1fce3f00763f6" gracePeriod=30 Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.946719 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.962797 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685444497c-gxfjk"] Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.967392 4618 scope.go:117] "RemoveContainer" containerID="3fb8a61031f1cac97760e25ad51f629e6a8d14a1fd08eec903066ea1cfd7ec94" Jan 21 09:19:00 crc kubenswrapper[4618]: I0121 09:19:00.982583 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685444497c-gxfjk"] Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.009753 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5z9c\" (UniqueName: \"kubernetes.io/projected/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-kube-api-access-c5z9c\") pod \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.009954 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-catalog-content\") pod \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.010121 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-utilities\") pod \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\" (UID: \"9634d7b6-05f5-408f-a6a5-01aa17d9bfb8\") " Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.011608 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-utilities" (OuterVolumeSpecName: "utilities") pod "9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" (UID: "9634d7b6-05f5-408f-a6a5-01aa17d9bfb8"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.015553 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-kube-api-access-c5z9c" (OuterVolumeSpecName: "kube-api-access-c5z9c") pod "9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" (UID: "9634d7b6-05f5-408f-a6a5-01aa17d9bfb8"). InnerVolumeSpecName "kube-api-access-c5z9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.105550 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" (UID: "9634d7b6-05f5-408f-a6a5-01aa17d9bfb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.113345 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5z9c\" (UniqueName: \"kubernetes.io/projected/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-kube-api-access-c5z9c\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.113500 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.113563 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.549915 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" 
path="/var/lib/kubelet/pods/3f873699-80a9-4e80-93fc-7c99e5cd5e69/volumes" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.899843 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4sxs"] Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.904033 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4sxs" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="registry-server" containerID="cri-o://204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be" gracePeriod=2 Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.916848 4618 generic.go:334] "Generic (PLEG): container finished" podID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerID="23a78fba194b02a5e7856f89511758e90d608bb6a0965f2bf6a1fce3f00763f6" exitCode=0 Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.916942 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cc678ed-dd0c-4b8b-b857-51e5c128f870","Type":"ContainerDied","Data":"23a78fba194b02a5e7856f89511758e90d608bb6a0965f2bf6a1fce3f00763f6"} Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.919508 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdwbk" Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.950377 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sdwbk"] Jan 21 09:19:01 crc kubenswrapper[4618]: I0121 09:19:01.969125 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sdwbk"] Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.407125 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.538557 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-catalog-content\") pod \"b69aa60f-666b-4605-b06b-debf9ef07d48\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.538996 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-utilities\") pod \"b69aa60f-666b-4605-b06b-debf9ef07d48\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.539044 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4279m\" (UniqueName: \"kubernetes.io/projected/b69aa60f-666b-4605-b06b-debf9ef07d48-kube-api-access-4279m\") pod \"b69aa60f-666b-4605-b06b-debf9ef07d48\" (UID: \"b69aa60f-666b-4605-b06b-debf9ef07d48\") " Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.539537 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-utilities" (OuterVolumeSpecName: "utilities") pod "b69aa60f-666b-4605-b06b-debf9ef07d48" (UID: "b69aa60f-666b-4605-b06b-debf9ef07d48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.545675 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69aa60f-666b-4605-b06b-debf9ef07d48-kube-api-access-4279m" (OuterVolumeSpecName: "kube-api-access-4279m") pod "b69aa60f-666b-4605-b06b-debf9ef07d48" (UID: "b69aa60f-666b-4605-b06b-debf9ef07d48"). InnerVolumeSpecName "kube-api-access-4279m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.572275 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b69aa60f-666b-4605-b06b-debf9ef07d48" (UID: "b69aa60f-666b-4605-b06b-debf9ef07d48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.641254 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.641372 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69aa60f-666b-4605-b06b-debf9ef07d48-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.641430 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4279m\" (UniqueName: \"kubernetes.io/projected/b69aa60f-666b-4605-b06b-debf9ef07d48-kube-api-access-4279m\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.932059 4618 generic.go:334] "Generic (PLEG): container finished" podID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerID="204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be" exitCode=0 Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.932116 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4sxs" event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerDied","Data":"204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be"} Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.932181 4618 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-p4sxs" event={"ID":"b69aa60f-666b-4605-b06b-debf9ef07d48","Type":"ContainerDied","Data":"073a7bad8e185d95f24ac0cad0c1b61f1b06f9a2bb8e40702b0b1359ef1f995d"} Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.932208 4618 scope.go:117] "RemoveContainer" containerID="204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.932364 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4sxs" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.974013 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4sxs"] Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.974046 4618 scope.go:117] "RemoveContainer" containerID="ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc" Jan 21 09:19:02 crc kubenswrapper[4618]: I0121 09:19:02.986882 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4sxs"] Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.000351 4618 scope.go:117] "RemoveContainer" containerID="4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.028384 4618 scope.go:117] "RemoveContainer" containerID="204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be" Jan 21 09:19:03 crc kubenswrapper[4618]: E0121 09:19:03.028696 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be\": container with ID starting with 204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be not found: ID does not exist" containerID="204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 
09:19:03.028737 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be"} err="failed to get container status \"204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be\": rpc error: code = NotFound desc = could not find container \"204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be\": container with ID starting with 204f268ba95da7456c2e5a9152269cf874ccb6744a51adf463fad2202a25b7be not found: ID does not exist" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.028764 4618 scope.go:117] "RemoveContainer" containerID="ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc" Jan 21 09:19:03 crc kubenswrapper[4618]: E0121 09:19:03.029037 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc\": container with ID starting with ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc not found: ID does not exist" containerID="ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.029062 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc"} err="failed to get container status \"ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc\": rpc error: code = NotFound desc = could not find container \"ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc\": container with ID starting with ff50903d076fdb263a0a202b9509194ae01f87e677c81c7b0ddfd73df397d5fc not found: ID does not exist" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.029076 4618 scope.go:117] "RemoveContainer" containerID="4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804" Jan 21 09:19:03 crc 
kubenswrapper[4618]: E0121 09:19:03.029431 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804\": container with ID starting with 4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804 not found: ID does not exist" containerID="4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.029450 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804"} err="failed to get container status \"4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804\": rpc error: code = NotFound desc = could not find container \"4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804\": container with ID starting with 4bb6b1672d879303367f2d2e462aead903bc62a9bdf93ed96c949172991a8804 not found: ID does not exist" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.132003 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:37974->10.217.0.161:9311: read: connection reset by peer" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.131997 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:37958->10.217.0.161:9311: read: connection reset by peer" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.519363 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.547707 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" path="/var/lib/kubelet/pods/9634d7b6-05f5-408f-a6a5-01aa17d9bfb8/volumes" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.548467 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" path="/var/lib/kubelet/pods/b69aa60f-666b-4605-b06b-debf9ef07d48/volumes" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.667985 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-combined-ca-bundle\") pod \"01d5744a-4aa4-403e-9051-a4764b44304c\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.668076 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data-custom\") pod \"01d5744a-4aa4-403e-9051-a4764b44304c\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.668179 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d5744a-4aa4-403e-9051-a4764b44304c-logs\") pod \"01d5744a-4aa4-403e-9051-a4764b44304c\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.668211 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqltf\" (UniqueName: \"kubernetes.io/projected/01d5744a-4aa4-403e-9051-a4764b44304c-kube-api-access-sqltf\") pod \"01d5744a-4aa4-403e-9051-a4764b44304c\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " Jan 21 
09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.668312 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data\") pod \"01d5744a-4aa4-403e-9051-a4764b44304c\" (UID: \"01d5744a-4aa4-403e-9051-a4764b44304c\") " Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.668604 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d5744a-4aa4-403e-9051-a4764b44304c-logs" (OuterVolumeSpecName: "logs") pod "01d5744a-4aa4-403e-9051-a4764b44304c" (UID: "01d5744a-4aa4-403e-9051-a4764b44304c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.669116 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d5744a-4aa4-403e-9051-a4764b44304c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.673249 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "01d5744a-4aa4-403e-9051-a4764b44304c" (UID: "01d5744a-4aa4-403e-9051-a4764b44304c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.674201 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d5744a-4aa4-403e-9051-a4764b44304c-kube-api-access-sqltf" (OuterVolumeSpecName: "kube-api-access-sqltf") pod "01d5744a-4aa4-403e-9051-a4764b44304c" (UID: "01d5744a-4aa4-403e-9051-a4764b44304c"). InnerVolumeSpecName "kube-api-access-sqltf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.691889 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01d5744a-4aa4-403e-9051-a4764b44304c" (UID: "01d5744a-4aa4-403e-9051-a4764b44304c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.707382 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data" (OuterVolumeSpecName: "config-data") pod "01d5744a-4aa4-403e-9051-a4764b44304c" (UID: "01d5744a-4aa4-403e-9051-a4764b44304c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.771205 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.771236 4618 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.771247 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqltf\" (UniqueName: \"kubernetes.io/projected/01d5744a-4aa4-403e-9051-a4764b44304c-kube-api-access-sqltf\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.771257 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d5744a-4aa4-403e-9051-a4764b44304c-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.943272 4618 generic.go:334] "Generic (PLEG): container finished" podID="01d5744a-4aa4-403e-9051-a4764b44304c" containerID="266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6" exitCode=0 Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.943372 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d46c4985b-6wf5b" event={"ID":"01d5744a-4aa4-403e-9051-a4764b44304c","Type":"ContainerDied","Data":"266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6"} Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.943432 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d46c4985b-6wf5b" event={"ID":"01d5744a-4aa4-403e-9051-a4764b44304c","Type":"ContainerDied","Data":"51371dacd2de0af42c1623072961a550b90ae8e346c18b5da0b598511fdba2cc"} Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.943452 4618 scope.go:117] "RemoveContainer" containerID="266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.944193 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d46c4985b-6wf5b" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.982311 4618 scope.go:117] "RemoveContainer" containerID="150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6" Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.982329 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d46c4985b-6wf5b"] Jan 21 09:19:03 crc kubenswrapper[4618]: I0121 09:19:03.986253 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6d46c4985b-6wf5b"] Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.029018 4618 scope.go:117] "RemoveContainer" containerID="266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6" Jan 21 09:19:04 crc kubenswrapper[4618]: E0121 09:19:04.029421 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6\": container with ID starting with 266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6 not found: ID does not exist" containerID="266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.029465 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6"} err="failed to get container status \"266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6\": rpc error: code = NotFound desc = could not find container \"266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6\": container with ID starting with 266d8bf3f8dd4775b750e1edf60d4b32571db00aeca2a78f52e7f6cb244bb8c6 not found: ID does not exist" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.029494 4618 scope.go:117] "RemoveContainer" containerID="150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6" Jan 
21 09:19:04 crc kubenswrapper[4618]: E0121 09:19:04.029906 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6\": container with ID starting with 150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6 not found: ID does not exist" containerID="150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.029951 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6"} err="failed to get container status \"150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6\": rpc error: code = NotFound desc = could not find container \"150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6\": container with ID starting with 150229d16f995643fb849ddb8f1a9ded2d3001bef4b77f648a08c7af8e9a9bb6 not found: ID does not exist" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.098948 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.099026 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.140815 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.792117 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.902800 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-config\") pod \"7b0e8004-6943-476d-9fab-36846adbc5de\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.902893 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-httpd-config\") pod \"7b0e8004-6943-476d-9fab-36846adbc5de\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.903111 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8th6s\" (UniqueName: \"kubernetes.io/projected/7b0e8004-6943-476d-9fab-36846adbc5de-kube-api-access-8th6s\") pod \"7b0e8004-6943-476d-9fab-36846adbc5de\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.903270 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-combined-ca-bundle\") pod \"7b0e8004-6943-476d-9fab-36846adbc5de\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.903431 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-ovndb-tls-certs\") pod \"7b0e8004-6943-476d-9fab-36846adbc5de\" (UID: \"7b0e8004-6943-476d-9fab-36846adbc5de\") " Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.914101 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7b0e8004-6943-476d-9fab-36846adbc5de" (UID: "7b0e8004-6943-476d-9fab-36846adbc5de"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.917876 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0e8004-6943-476d-9fab-36846adbc5de-kube-api-access-8th6s" (OuterVolumeSpecName: "kube-api-access-8th6s") pod "7b0e8004-6943-476d-9fab-36846adbc5de" (UID: "7b0e8004-6943-476d-9fab-36846adbc5de"). InnerVolumeSpecName "kube-api-access-8th6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.942195 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b0e8004-6943-476d-9fab-36846adbc5de" (UID: "7b0e8004-6943-476d-9fab-36846adbc5de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.945092 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-config" (OuterVolumeSpecName: "config") pod "7b0e8004-6943-476d-9fab-36846adbc5de" (UID: "7b0e8004-6943-476d-9fab-36846adbc5de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.975690 4618 generic.go:334] "Generic (PLEG): container finished" podID="7b0e8004-6943-476d-9fab-36846adbc5de" containerID="984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de" exitCode=0 Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.975796 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69f66b98c6-zmvmx" event={"ID":"7b0e8004-6943-476d-9fab-36846adbc5de","Type":"ContainerDied","Data":"984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de"} Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.976185 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69f66b98c6-zmvmx" event={"ID":"7b0e8004-6943-476d-9fab-36846adbc5de","Type":"ContainerDied","Data":"f8ea2c75a8552bd2d6bd9dbc3fb31d0e4c4d110c23118aaa6b014ebca533d7c0"} Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.976215 4618 scope.go:117] "RemoveContainer" containerID="617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.975850 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69f66b98c6-zmvmx" Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.978873 4618 generic.go:334] "Generic (PLEG): container finished" podID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerID="f4aa7fb21018b58c5b7bdea2e2db928579996999c8ff9bf75e776ee65541237c" exitCode=0 Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.978936 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cc678ed-dd0c-4b8b-b857-51e5c128f870","Type":"ContainerDied","Data":"f4aa7fb21018b58c5b7bdea2e2db928579996999c8ff9bf75e776ee65541237c"} Jan 21 09:19:04 crc kubenswrapper[4618]: I0121 09:19:04.989201 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7b0e8004-6943-476d-9fab-36846adbc5de" (UID: "7b0e8004-6943-476d-9fab-36846adbc5de"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.005484 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8th6s\" (UniqueName: \"kubernetes.io/projected/7b0e8004-6943-476d-9fab-36846adbc5de-kube-api-access-8th6s\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.005515 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.005532 4618 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.005542 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.005556 4618 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b0e8004-6943-476d-9fab-36846adbc5de-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.018254 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.061299 4618 scope.go:117] "RemoveContainer" containerID="984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.066441 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.089068 4618 scope.go:117] "RemoveContainer" containerID="617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9" Jan 21 09:19:05 crc kubenswrapper[4618]: E0121 09:19:05.090394 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9\": container with ID starting with 617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9 not found: ID does not exist" containerID="617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.090435 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9"} err="failed to get container status \"617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9\": rpc error: code = NotFound desc = could not find container \"617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9\": container with ID starting with 617b692bac72b3709ce537d23cd5d8374318ea8f239a1088d0cbd92b90884ac9 not found: ID does not exist" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.090461 4618 scope.go:117] "RemoveContainer" containerID="984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de" Jan 21 09:19:05 crc kubenswrapper[4618]: E0121 09:19:05.090847 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de\": container with ID starting with 984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de not found: ID does not exist" containerID="984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 
09:19:05.090878 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de"} err="failed to get container status \"984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de\": rpc error: code = NotFound desc = could not find container \"984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de\": container with ID starting with 984eee65e3ed59180030ce3c7ddcabba5952a8b964632eb0b2915fe04416f4de not found: ID does not exist" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.209503 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data-custom\") pod \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.209585 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-combined-ca-bundle\") pod \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.209731 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-scripts\") pod \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.209807 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data\") pod \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 
09:19:05.209834 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cc678ed-dd0c-4b8b-b857-51e5c128f870-etc-machine-id\") pod \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.209931 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvfdf\" (UniqueName: \"kubernetes.io/projected/2cc678ed-dd0c-4b8b-b857-51e5c128f870-kube-api-access-rvfdf\") pod \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\" (UID: \"2cc678ed-dd0c-4b8b-b857-51e5c128f870\") " Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.210269 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cc678ed-dd0c-4b8b-b857-51e5c128f870-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2cc678ed-dd0c-4b8b-b857-51e5c128f870" (UID: "2cc678ed-dd0c-4b8b-b857-51e5c128f870"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.210708 4618 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cc678ed-dd0c-4b8b-b857-51e5c128f870-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.216795 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc678ed-dd0c-4b8b-b857-51e5c128f870-kube-api-access-rvfdf" (OuterVolumeSpecName: "kube-api-access-rvfdf") pod "2cc678ed-dd0c-4b8b-b857-51e5c128f870" (UID: "2cc678ed-dd0c-4b8b-b857-51e5c128f870"). InnerVolumeSpecName "kube-api-access-rvfdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.216911 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2cc678ed-dd0c-4b8b-b857-51e5c128f870" (UID: "2cc678ed-dd0c-4b8b-b857-51e5c128f870"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.218532 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-scripts" (OuterVolumeSpecName: "scripts") pod "2cc678ed-dd0c-4b8b-b857-51e5c128f870" (UID: "2cc678ed-dd0c-4b8b-b857-51e5c128f870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.256652 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cc678ed-dd0c-4b8b-b857-51e5c128f870" (UID: "2cc678ed-dd0c-4b8b-b857-51e5c128f870"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.289125 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data" (OuterVolumeSpecName: "config-data") pod "2cc678ed-dd0c-4b8b-b857-51e5c128f870" (UID: "2cc678ed-dd0c-4b8b-b857-51e5c128f870"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.312819 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.312866 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.312878 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.312888 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvfdf\" (UniqueName: \"kubernetes.io/projected/2cc678ed-dd0c-4b8b-b857-51e5c128f870-kube-api-access-rvfdf\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.312901 4618 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc678ed-dd0c-4b8b-b857-51e5c128f870-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.340183 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69f66b98c6-zmvmx"] Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.349125 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-69f66b98c6-zmvmx"] Jan 21 09:19:05 crc kubenswrapper[4618]: I0121 09:19:05.551062 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" path="/var/lib/kubelet/pods/01d5744a-4aa4-403e-9051-a4764b44304c/volumes" Jan 21 09:19:05 crc 
kubenswrapper[4618]: I0121 09:19:05.551822 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" path="/var/lib/kubelet/pods/7b0e8004-6943-476d-9fab-36846adbc5de/volumes" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.005092 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.005593 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cc678ed-dd0c-4b8b-b857-51e5c128f870","Type":"ContainerDied","Data":"dfadf146e354a227a3f3c2b54749e1867c345bb2bed592d53106a4de3e77841e"} Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.005637 4618 scope.go:117] "RemoveContainer" containerID="23a78fba194b02a5e7856f89511758e90d608bb6a0965f2bf6a1fce3f00763f6" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.033521 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.041667 4618 scope.go:117] "RemoveContainer" containerID="f4aa7fb21018b58c5b7bdea2e2db928579996999c8ff9bf75e776ee65541237c" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.044220 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.056179 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.059948 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="probe" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.059977 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="probe" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.059988 4618 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerName="init" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.059995 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerName="init" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060008 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon-log" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060015 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon-log" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060028 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="cinder-scheduler" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060034 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="cinder-scheduler" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060046 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060051 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060058 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="extract-utilities" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060064 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="extract-utilities" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060075 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-httpd" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060080 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-httpd" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060098 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060105 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060112 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="registry-server" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060117 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="registry-server" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060127 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060133 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060156 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="extract-content" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060162 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="extract-content" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060172 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="registry-server" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060177 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="registry-server" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060198 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="extract-content" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060204 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="extract-content" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060218 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerName="dnsmasq-dns" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060224 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerName="dnsmasq-dns" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060231 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-api" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060236 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-api" Jan 21 09:19:06 crc kubenswrapper[4618]: E0121 09:19:06.060243 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="extract-utilities" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060248 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="extract-utilities" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060419 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9634d7b6-05f5-408f-a6a5-01aa17d9bfb8" containerName="registry-server" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060434 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f873699-80a9-4e80-93fc-7c99e5cd5e69" containerName="dnsmasq-dns" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060441 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon-log" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060451 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="probe" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060458 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060465 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-api" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060473 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060483 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="048b9318-e305-40c3-86e9-9081b01ca1cb" containerName="horizon" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060494 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0e8004-6943-476d-9fab-36846adbc5de" containerName="neutron-httpd" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060503 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69aa60f-666b-4605-b06b-debf9ef07d48" containerName="registry-server" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.060513 4618 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" containerName="cinder-scheduler" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.061940 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.065584 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.066842 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.132197 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-config-data\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.133006 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz45z\" (UniqueName: \"kubernetes.io/projected/def78b06-bd3c-4722-82a7-15b80abe36fe-kube-api-access-nz45z\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.133043 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-scripts\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.133234 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/def78b06-bd3c-4722-82a7-15b80abe36fe-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.133254 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.133349 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235203 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235310 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-config-data\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz45z\" (UniqueName: \"kubernetes.io/projected/def78b06-bd3c-4722-82a7-15b80abe36fe-kube-api-access-nz45z\") pod 
\"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235433 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-scripts\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235558 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/def78b06-bd3c-4722-82a7-15b80abe36fe-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235581 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.235698 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/def78b06-bd3c-4722-82a7-15b80abe36fe-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.241346 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-scripts\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.241766 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.242155 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.242796 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def78b06-bd3c-4722-82a7-15b80abe36fe-config-data\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.262516 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz45z\" (UniqueName: \"kubernetes.io/projected/def78b06-bd3c-4722-82a7-15b80abe36fe-kube-api-access-nz45z\") pod \"cinder-scheduler-0\" (UID: \"def78b06-bd3c-4722-82a7-15b80abe36fe\") " pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.383000 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.714599 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.891781 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tldbm"] Jan 21 09:19:06 crc kubenswrapper[4618]: I0121 09:19:06.928296 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 09:19:06 crc kubenswrapper[4618]: W0121 09:19:06.934700 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddef78b06_bd3c_4722_82a7_15b80abe36fe.slice/crio-4f112c5338e0d86707f065411f144b15d0dfe6402b8c63a90db91c9cd3305d9b WatchSource:0}: Error finding container 4f112c5338e0d86707f065411f144b15d0dfe6402b8c63a90db91c9cd3305d9b: Status 404 returned error can't find the container with id 4f112c5338e0d86707f065411f144b15d0dfe6402b8c63a90db91c9cd3305d9b Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.023301 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"def78b06-bd3c-4722-82a7-15b80abe36fe","Type":"ContainerStarted","Data":"4f112c5338e0d86707f065411f144b15d0dfe6402b8c63a90db91c9cd3305d9b"} Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.028369 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tldbm" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="registry-server" containerID="cri-o://4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18" gracePeriod=2 Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.516428 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.574897 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc678ed-dd0c-4b8b-b857-51e5c128f870" path="/var/lib/kubelet/pods/2cc678ed-dd0c-4b8b-b857-51e5c128f870/volumes" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.677172 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48mph\" (UniqueName: \"kubernetes.io/projected/eb33f26a-71d2-4907-a3c3-f57202c99176-kube-api-access-48mph\") pod \"eb33f26a-71d2-4907-a3c3-f57202c99176\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.677317 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-catalog-content\") pod \"eb33f26a-71d2-4907-a3c3-f57202c99176\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.677371 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-utilities\") pod \"eb33f26a-71d2-4907-a3c3-f57202c99176\" (UID: \"eb33f26a-71d2-4907-a3c3-f57202c99176\") " Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.678136 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-utilities" (OuterVolumeSpecName: "utilities") pod "eb33f26a-71d2-4907-a3c3-f57202c99176" (UID: "eb33f26a-71d2-4907-a3c3-f57202c99176"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.682411 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb33f26a-71d2-4907-a3c3-f57202c99176-kube-api-access-48mph" (OuterVolumeSpecName: "kube-api-access-48mph") pod "eb33f26a-71d2-4907-a3c3-f57202c99176" (UID: "eb33f26a-71d2-4907-a3c3-f57202c99176"). InnerVolumeSpecName "kube-api-access-48mph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.731669 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb33f26a-71d2-4907-a3c3-f57202c99176" (UID: "eb33f26a-71d2-4907-a3c3-f57202c99176"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.780238 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48mph\" (UniqueName: \"kubernetes.io/projected/eb33f26a-71d2-4907-a3c3-f57202c99176-kube-api-access-48mph\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.780279 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:07 crc kubenswrapper[4618]: I0121 09:19:07.780291 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb33f26a-71d2-4907-a3c3-f57202c99176-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.063456 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"def78b06-bd3c-4722-82a7-15b80abe36fe","Type":"ContainerStarted","Data":"f00564e929622a078eadcca7a0319ac5dc733e7634773c7169320a25af64d5b6"} Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.063790 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"def78b06-bd3c-4722-82a7-15b80abe36fe","Type":"ContainerStarted","Data":"9a7ec4fb2ed6069b4d3aa08925da1f3f343afb7215c64ff90b3f351c3690c2b5"} Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.067272 4618 generic.go:334] "Generic (PLEG): container finished" podID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerID="4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18" exitCode=0 Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.067303 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerDied","Data":"4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18"} Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.067330 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tldbm" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.067361 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tldbm" event={"ID":"eb33f26a-71d2-4907-a3c3-f57202c99176","Type":"ContainerDied","Data":"39d9984078e6869c34ec24cfc53701d675ee1a51d36f0c2b549422863760d6d3"} Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.067417 4618 scope.go:117] "RemoveContainer" containerID="4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.087407 4618 scope.go:117] "RemoveContainer" containerID="49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.091125 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.091113757 podStartE2EDuration="2.091113757s" podCreationTimestamp="2026-01-21 09:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:19:08.083514786 +0000 UTC m=+946.833982103" watchObservedRunningTime="2026-01-21 09:19:08.091113757 +0000 UTC m=+946.841581074" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.108061 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tldbm"] Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.113348 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tldbm"] Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.141197 4618 scope.go:117] "RemoveContainer" containerID="f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.180237 4618 scope.go:117] "RemoveContainer" 
containerID="4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18" Jan 21 09:19:08 crc kubenswrapper[4618]: E0121 09:19:08.180726 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18\": container with ID starting with 4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18 not found: ID does not exist" containerID="4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.180759 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18"} err="failed to get container status \"4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18\": rpc error: code = NotFound desc = could not find container \"4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18\": container with ID starting with 4b74c399d20c886eac5e74d82fa6a3ecd681b64e678020767f20f48f31bedd18 not found: ID does not exist" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.180780 4618 scope.go:117] "RemoveContainer" containerID="49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7" Jan 21 09:19:08 crc kubenswrapper[4618]: E0121 09:19:08.181059 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7\": container with ID starting with 49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7 not found: ID does not exist" containerID="49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.181081 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7"} err="failed to get container status \"49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7\": rpc error: code = NotFound desc = could not find container \"49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7\": container with ID starting with 49083b25d87030b42a0a59b9f1c6d77daceb76393bf1bfa897909c3f9da962d7 not found: ID does not exist" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.181095 4618 scope.go:117] "RemoveContainer" containerID="f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c" Jan 21 09:19:08 crc kubenswrapper[4618]: E0121 09:19:08.181362 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c\": container with ID starting with f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c not found: ID does not exist" containerID="f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.181383 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c"} err="failed to get container status \"f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c\": rpc error: code = NotFound desc = could not find container \"f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c\": container with ID starting with f9aac1d3653198ace0725d9acdfed219f6657d9a047a02da5908a9e501c5253c not found: ID does not exist" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.383313 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api" probeResult="failure" output="Get 
\"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 09:19:08 crc kubenswrapper[4618]: I0121 09:19:08.383340 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d46c4985b-6wf5b" podUID="01d5744a-4aa4-403e-9051-a4764b44304c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 09:19:09 crc kubenswrapper[4618]: I0121 09:19:09.449834 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7784c76494-zjhpz" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Jan 21 09:19:09 crc kubenswrapper[4618]: I0121 09:19:09.548108 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" path="/var/lib/kubelet/pods/eb33f26a-71d2-4907-a3c3-f57202c99176/volumes" Jan 21 09:19:09 crc kubenswrapper[4618]: I0121 09:19:09.665696 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54d488db9b-swfld" Jan 21 09:19:09 crc kubenswrapper[4618]: I0121 09:19:09.669468 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54d488db9b-swfld" Jan 21 09:19:10 crc kubenswrapper[4618]: I0121 09:19:10.032540 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76759dfdcd-gxbvm" Jan 21 09:19:11 crc kubenswrapper[4618]: I0121 09:19:11.383404 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.276596 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstackclient"] Jan 21 09:19:13 crc kubenswrapper[4618]: E0121 09:19:13.277438 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="registry-server" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.277455 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="registry-server" Jan 21 09:19:13 crc kubenswrapper[4618]: E0121 09:19:13.277488 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="extract-content" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.277495 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="extract-content" Jan 21 09:19:13 crc kubenswrapper[4618]: E0121 09:19:13.277513 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="extract-utilities" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.277520 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="extract-utilities" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.278037 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb33f26a-71d2-4907-a3c3-f57202c99176" containerName="registry-server" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.278753 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.280566 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.281079 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4zsr9" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.281357 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.283513 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.396689 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.396944 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc6k8\" (UniqueName: \"kubernetes.io/projected/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-kube-api-access-sc6k8\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.397125 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-openstack-config\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.397309 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.499729 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.499786 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc6k8\" (UniqueName: \"kubernetes.io/projected/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-kube-api-access-sc6k8\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.499842 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-openstack-config\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.499868 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.500852 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-openstack-config\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.510623 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.511455 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.526118 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc6k8\" (UniqueName: \"kubernetes.io/projected/6039b2d9-1ca5-480a-a1a4-f5ec50e082aa-kube-api-access-sc6k8\") pod \"openstackclient\" (UID: \"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa\") " pod="openstack/openstackclient" Jan 21 09:19:13 crc kubenswrapper[4618]: I0121 09:19:13.608666 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 21 09:19:14 crc kubenswrapper[4618]: I0121 09:19:14.074327 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 09:19:14 crc kubenswrapper[4618]: W0121 09:19:14.076086 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6039b2d9_1ca5_480a_a1a4_f5ec50e082aa.slice/crio-785a311f2bbbd18aa5df2029adb92a08244f957114b3dc3af30172bc2cbc1ed0 WatchSource:0}: Error finding container 785a311f2bbbd18aa5df2029adb92a08244f957114b3dc3af30172bc2cbc1ed0: Status 404 returned error can't find the container with id 785a311f2bbbd18aa5df2029adb92a08244f957114b3dc3af30172bc2cbc1ed0 Jan 21 09:19:14 crc kubenswrapper[4618]: I0121 09:19:14.127837 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa","Type":"ContainerStarted","Data":"785a311f2bbbd18aa5df2029adb92a08244f957114b3dc3af30172bc2cbc1ed0"} Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.293888 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d5bd5664f-ncbh6"] Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.295572 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.298988 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.299175 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.299747 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.306732 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d5bd5664f-ncbh6"] Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.461691 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcbc9a4-5180-4530-8003-a54391ebbd6c-log-httpd\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462040 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcbc9a4-5180-4530-8003-a54391ebbd6c-run-httpd\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462094 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3fcbc9a4-5180-4530-8003-a54391ebbd6c-etc-swift\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462119 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-internal-tls-certs\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462176 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-combined-ca-bundle\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462270 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-config-data\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462336 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-public-tls-certs\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.462380 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsx4\" (UniqueName: \"kubernetes.io/projected/3fcbc9a4-5180-4530-8003-a54391ebbd6c-kube-api-access-thsx4\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc 
kubenswrapper[4618]: I0121 09:19:16.563907 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsx4\" (UniqueName: \"kubernetes.io/projected/3fcbc9a4-5180-4530-8003-a54391ebbd6c-kube-api-access-thsx4\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.563975 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcbc9a4-5180-4530-8003-a54391ebbd6c-log-httpd\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564022 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcbc9a4-5180-4530-8003-a54391ebbd6c-run-httpd\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564047 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3fcbc9a4-5180-4530-8003-a54391ebbd6c-etc-swift\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564076 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-internal-tls-certs\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564098 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-combined-ca-bundle\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564172 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-config-data\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564237 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-public-tls-certs\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564522 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcbc9a4-5180-4530-8003-a54391ebbd6c-run-httpd\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.564566 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcbc9a4-5180-4530-8003-a54391ebbd6c-log-httpd\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.570538 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-public-tls-certs\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.572822 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-config-data\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.574573 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-combined-ca-bundle\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.577480 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3fcbc9a4-5180-4530-8003-a54391ebbd6c-etc-swift\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.581772 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsx4\" (UniqueName: \"kubernetes.io/projected/3fcbc9a4-5180-4530-8003-a54391ebbd6c-kube-api-access-thsx4\") pod \"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.582527 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcbc9a4-5180-4530-8003-a54391ebbd6c-internal-tls-certs\") pod 
\"swift-proxy-d5bd5664f-ncbh6\" (UID: \"3fcbc9a4-5180-4530-8003-a54391ebbd6c\") " pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.612786 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:16 crc kubenswrapper[4618]: I0121 09:19:16.615792 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 09:19:17 crc kubenswrapper[4618]: I0121 09:19:17.226781 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d5bd5664f-ncbh6"] Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.162338 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d5bd5664f-ncbh6" event={"ID":"3fcbc9a4-5180-4530-8003-a54391ebbd6c","Type":"ContainerStarted","Data":"4d2ebd00d8883d2c7fec00dc3d1b7467af00aedef1ce1b2fc107a0c302004213"} Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.162768 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.162782 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d5bd5664f-ncbh6" event={"ID":"3fcbc9a4-5180-4530-8003-a54391ebbd6c","Type":"ContainerStarted","Data":"c14d2854c3c8056c1b7243d9d8608df50534325f15865fdb80e3af329cf1a168"} Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.162794 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d5bd5664f-ncbh6" event={"ID":"3fcbc9a4-5180-4530-8003-a54391ebbd6c","Type":"ContainerStarted","Data":"0ed656b2df5b5e13cc20ad45c28c7c76fa1af784fcf557a38beeec4a2fd72c39"} Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.178910 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d5bd5664f-ncbh6" podStartSLOduration=2.178896495 
podStartE2EDuration="2.178896495s" podCreationTimestamp="2026-01-21 09:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:19:18.175480702 +0000 UTC m=+956.925948019" watchObservedRunningTime="2026-01-21 09:19:18.178896495 +0000 UTC m=+956.929363803" Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.196313 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.196587 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-central-agent" containerID="cri-o://0d0c716bc5e69ae3e0aa858cfc309ae774ec3fab70e4cf1e0841d7e8826bb0bd" gracePeriod=30 Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.196661 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="sg-core" containerID="cri-o://a2825559adc6707e7f5bb3329894c71e58af4f56342f9e8f13341a4cfc919cc1" gracePeriod=30 Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.196697 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-notification-agent" containerID="cri-o://baa19d216905e3374a20be89dd594efda419f21f7c345ef9ec8687bfc85c6046" gracePeriod=30 Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.196760 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="proxy-httpd" containerID="cri-o://ac778dbfaff7210fda0908dbfdb4dfbf29e869c826ae63523e69c48be89afeb5" gracePeriod=30 Jan 21 09:19:18 crc kubenswrapper[4618]: I0121 09:19:18.201702 4618 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.179966 4618 generic.go:334] "Generic (PLEG): container finished" podID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerID="ac778dbfaff7210fda0908dbfdb4dfbf29e869c826ae63523e69c48be89afeb5" exitCode=0 Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.180288 4618 generic.go:334] "Generic (PLEG): container finished" podID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerID="a2825559adc6707e7f5bb3329894c71e58af4f56342f9e8f13341a4cfc919cc1" exitCode=2 Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.180298 4618 generic.go:334] "Generic (PLEG): container finished" podID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerID="0d0c716bc5e69ae3e0aa858cfc309ae774ec3fab70e4cf1e0841d7e8826bb0bd" exitCode=0 Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.180012 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerDied","Data":"ac778dbfaff7210fda0908dbfdb4dfbf29e869c826ae63523e69c48be89afeb5"} Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.180402 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerDied","Data":"a2825559adc6707e7f5bb3329894c71e58af4f56342f9e8f13341a4cfc919cc1"} Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.180683 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.180703 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerDied","Data":"0d0c716bc5e69ae3e0aa858cfc309ae774ec3fab70e4cf1e0841d7e8826bb0bd"} Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.254133 4618 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.254493 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-log" containerID="cri-o://1c071816e027744ad70de7f9ccd11e6896cb6217d495383eef6f66db52694e42" gracePeriod=30 Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.254570 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-httpd" containerID="cri-o://53b10d67a60fc5d18ff2f8559caaa2cba98240b0cb3fc37afa5b461f8b8c9ebf" gracePeriod=30 Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.449822 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7784c76494-zjhpz" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Jan 21 09:19:19 crc kubenswrapper[4618]: I0121 09:19:19.449973 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:19:20 crc kubenswrapper[4618]: I0121 09:19:20.244935 4618 generic.go:334] "Generic (PLEG): container finished" podID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerID="1c071816e027744ad70de7f9ccd11e6896cb6217d495383eef6f66db52694e42" exitCode=143 Jan 21 09:19:20 crc kubenswrapper[4618]: I0121 09:19:20.245027 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df21bdd0-078c-45ea-9027-7c9c70f53513","Type":"ContainerDied","Data":"1c071816e027744ad70de7f9ccd11e6896cb6217d495383eef6f66db52694e42"} Jan 21 09:19:21 crc kubenswrapper[4618]: I0121 09:19:21.259239 4618 
generic.go:334] "Generic (PLEG): container finished" podID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerID="baa19d216905e3374a20be89dd594efda419f21f7c345ef9ec8687bfc85c6046" exitCode=0 Jan 21 09:19:21 crc kubenswrapper[4618]: I0121 09:19:21.259331 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerDied","Data":"baa19d216905e3374a20be89dd594efda419f21f7c345ef9ec8687bfc85c6046"} Jan 21 09:19:21 crc kubenswrapper[4618]: I0121 09:19:21.949870 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": dial tcp 10.217.0.166:3000: connect: connection refused" Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.050917 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.051409 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c74af7ae-2eee-4b63-8515-0230ddf143c8" containerName="kube-state-metrics" containerID="cri-o://7b2631fd49caf28d1118fdfa040ce0694b5c656219d650d65cf61960b915de81" gracePeriod=30 Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.268619 4618 generic.go:334] "Generic (PLEG): container finished" podID="c74af7ae-2eee-4b63-8515-0230ddf143c8" containerID="7b2631fd49caf28d1118fdfa040ce0694b5c656219d650d65cf61960b915de81" exitCode=2 Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.268649 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c74af7ae-2eee-4b63-8515-0230ddf143c8","Type":"ContainerDied","Data":"7b2631fd49caf28d1118fdfa040ce0694b5c656219d650d65cf61960b915de81"} Jan 21 09:19:22 crc kubenswrapper[4618]: W0121 09:19:22.465239 4618 helpers.go:245] readString: 
Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf21bdd0_078c_45ea_9027_7c9c70f53513.slice/crio-da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453/pids.max": read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf21bdd0_078c_45ea_9027_7c9c70f53513.slice/crio-da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453/pids.max: no such device Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.618699 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.618993 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-log" containerID="cri-o://930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303" gracePeriod=30 Jan 21 09:19:22 crc kubenswrapper[4618]: I0121 09:19:22.619230 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-httpd" containerID="cri-o://ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87" gracePeriod=30 Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.304375 4618 generic.go:334] "Generic (PLEG): container finished" podID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerID="930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303" exitCode=143 Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.304579 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a68fe88-de0c-468d-998a-77d5d5a29d83","Type":"ContainerDied","Data":"930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303"} Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.311070 4618 generic.go:334] "Generic (PLEG): 
container finished" podID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerID="53b10d67a60fc5d18ff2f8559caaa2cba98240b0cb3fc37afa5b461f8b8c9ebf" exitCode=0 Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.311109 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df21bdd0-078c-45ea-9027-7c9c70f53513","Type":"ContainerDied","Data":"53b10d67a60fc5d18ff2f8559caaa2cba98240b0cb3fc37afa5b461f8b8c9ebf"} Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.855042 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.951176 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:19:23 crc kubenswrapper[4618]: I0121 09:19:23.958354 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037396 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-config-data\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037568 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-logs\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037679 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-scripts\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: 
\"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037711 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-public-tls-certs\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037728 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-combined-ca-bundle\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037754 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz64n\" (UniqueName: \"kubernetes.io/projected/df21bdd0-078c-45ea-9027-7c9c70f53513-kube-api-access-pz64n\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037770 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037805 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-httpd-run\") pod \"df21bdd0-078c-45ea-9027-7c9c70f53513\" (UID: \"df21bdd0-078c-45ea-9027-7c9c70f53513\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.037874 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6dqq\" (UniqueName: 
\"kubernetes.io/projected/c74af7ae-2eee-4b63-8515-0230ddf143c8-kube-api-access-w6dqq\") pod \"c74af7ae-2eee-4b63-8515-0230ddf143c8\" (UID: \"c74af7ae-2eee-4b63-8515-0230ddf143c8\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.038091 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-logs" (OuterVolumeSpecName: "logs") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.038347 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.038902 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.054918 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.054964 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-scripts" (OuterVolumeSpecName: "scripts") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.055810 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df21bdd0-078c-45ea-9027-7c9c70f53513-kube-api-access-pz64n" (OuterVolumeSpecName: "kube-api-access-pz64n") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "kube-api-access-pz64n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.056445 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74af7ae-2eee-4b63-8515-0230ddf143c8-kube-api-access-w6dqq" (OuterVolumeSpecName: "kube-api-access-w6dqq") pod "c74af7ae-2eee-4b63-8515-0230ddf143c8" (UID: "c74af7ae-2eee-4b63-8515-0230ddf143c8"). InnerVolumeSpecName "kube-api-access-w6dqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.072588 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.084490 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.086458 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-config-data" (OuterVolumeSpecName: "config-data") pod "df21bdd0-078c-45ea-9027-7c9c70f53513" (UID: "df21bdd0-078c-45ea-9027-7c9c70f53513"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.139317 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-config-data\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.139450 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8g8q\" (UniqueName: \"kubernetes.io/projected/701c5495-43ad-4008-b8e2-57b5b8a18d56-kube-api-access-w8g8q\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.139499 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-sg-core-conf-yaml\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 
09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.139549 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-scripts\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.139617 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-log-httpd\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.140109 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.140269 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-run-httpd\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.140466 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.140525 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-combined-ca-bundle\") pod \"701c5495-43ad-4008-b8e2-57b5b8a18d56\" (UID: \"701c5495-43ad-4008-b8e2-57b5b8a18d56\") " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141129 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6dqq\" (UniqueName: \"kubernetes.io/projected/c74af7ae-2eee-4b63-8515-0230ddf143c8-kube-api-access-w6dqq\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141173 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141184 4618 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141195 4618 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701c5495-43ad-4008-b8e2-57b5b8a18d56-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141204 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141215 4618 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 
09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141226 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df21bdd0-078c-45ea-9027-7c9c70f53513-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141236 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz64n\" (UniqueName: \"kubernetes.io/projected/df21bdd0-078c-45ea-9027-7c9c70f53513-kube-api-access-pz64n\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141273 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.141286 4618 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df21bdd0-078c-45ea-9027-7c9c70f53513-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.145380 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-scripts" (OuterVolumeSpecName: "scripts") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.145463 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701c5495-43ad-4008-b8e2-57b5b8a18d56-kube-api-access-w8g8q" (OuterVolumeSpecName: "kube-api-access-w8g8q") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "kube-api-access-w8g8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.160490 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.162552 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.206982 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.219355 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-config-data" (OuterVolumeSpecName: "config-data") pod "701c5495-43ad-4008-b8e2-57b5b8a18d56" (UID: "701c5495-43ad-4008-b8e2-57b5b8a18d56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.243801 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.243829 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8g8q\" (UniqueName: \"kubernetes.io/projected/701c5495-43ad-4008-b8e2-57b5b8a18d56-kube-api-access-w8g8q\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.243846 4618 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.243860 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.243871 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701c5495-43ad-4008-b8e2-57b5b8a18d56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.243884 4618 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.323744 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df21bdd0-078c-45ea-9027-7c9c70f53513","Type":"ContainerDied","Data":"da2ce516038bf4e2e2eed53a4f83842563411a6fb841009fec406e4987a5c453"} Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 
09:19:24.323786 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.323852 4618 scope.go:117] "RemoveContainer" containerID="53b10d67a60fc5d18ff2f8559caaa2cba98240b0cb3fc37afa5b461f8b8c9ebf" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.327434 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.327476 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c74af7ae-2eee-4b63-8515-0230ddf143c8","Type":"ContainerDied","Data":"07c40f291cfd3f5d0a4e9d742fd8d51723ed37a772932f5a5a4e041ea0136833"} Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.333122 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701c5495-43ad-4008-b8e2-57b5b8a18d56","Type":"ContainerDied","Data":"35c9eb62674b9f64c9e3fa6345544f11dc4cb87fa001cf46ffb1c2594a99e950"} Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.333329 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.338968 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6039b2d9-1ca5-480a-a1a4-f5ec50e082aa","Type":"ContainerStarted","Data":"bb725499451bee7a5c9540a5b0a40c693529e218d69d301f99fcddbf10fddf59"} Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.352610 4618 scope.go:117] "RemoveContainer" containerID="1c071816e027744ad70de7f9ccd11e6896cb6217d495383eef6f66db52694e42" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.369786 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.929385827 podStartE2EDuration="11.369773838s" podCreationTimestamp="2026-01-21 09:19:13 +0000 UTC" firstStartedPulling="2026-01-21 09:19:14.078129198 +0000 UTC m=+952.828596516" lastFinishedPulling="2026-01-21 09:19:23.51851721 +0000 UTC m=+962.268984527" observedRunningTime="2026-01-21 09:19:24.353514961 +0000 UTC m=+963.103982278" watchObservedRunningTime="2026-01-21 09:19:24.369773838 +0000 UTC m=+963.120241155" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.378485 4618 scope.go:117] "RemoveContainer" containerID="7b2631fd49caf28d1118fdfa040ce0694b5c656219d650d65cf61960b915de81" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.389300 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.401065 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.476215 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478253 4618 scope.go:117] "RemoveContainer" containerID="ac778dbfaff7210fda0908dbfdb4dfbf29e869c826ae63523e69c48be89afeb5" Jan 21 09:19:24 crc 
kubenswrapper[4618]: E0121 09:19:24.478360 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-log" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478397 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-log" Jan 21 09:19:24 crc kubenswrapper[4618]: E0121 09:19:24.478453 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="proxy-httpd" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478466 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="proxy-httpd" Jan 21 09:19:24 crc kubenswrapper[4618]: E0121 09:19:24.478480 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-httpd" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478488 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-httpd" Jan 21 09:19:24 crc kubenswrapper[4618]: E0121 09:19:24.478515 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74af7ae-2eee-4b63-8515-0230ddf143c8" containerName="kube-state-metrics" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478521 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74af7ae-2eee-4b63-8515-0230ddf143c8" containerName="kube-state-metrics" Jan 21 09:19:24 crc kubenswrapper[4618]: E0121 09:19:24.478536 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-notification-agent" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478543 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-notification-agent" Jan 21 09:19:24 crc 
kubenswrapper[4618]: E0121 09:19:24.478555 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-central-agent" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478568 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-central-agent" Jan 21 09:19:24 crc kubenswrapper[4618]: E0121 09:19:24.478577 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="sg-core" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478583 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="sg-core" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478777 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-log" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478797 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" containerName="glance-httpd" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478811 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="proxy-httpd" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478820 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="sg-core" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478829 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-notification-agent" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478838 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" containerName="ceilometer-central-agent" 
Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.478850 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74af7ae-2eee-4b63-8515-0230ddf143c8" containerName="kube-state-metrics" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.480330 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.483575 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-f9lpg" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.483588 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.483743 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.494572 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.508241 4618 scope.go:117] "RemoveContainer" containerID="a2825559adc6707e7f5bb3329894c71e58af4f56342f9e8f13341a4cfc919cc1" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.518440 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.524694 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.527670 4618 scope.go:117] "RemoveContainer" containerID="baa19d216905e3374a20be89dd594efda419f21f7c345ef9ec8687bfc85c6046" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.532253 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.537167 4618 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.541798 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.543342 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.548164 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.549063 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.549124 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.550337 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.552171 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.553048 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.553497 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.553729 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.555191 4618 scope.go:117] "RemoveContainer" containerID="0d0c716bc5e69ae3e0aa858cfc309ae774ec3fab70e4cf1e0841d7e8826bb0bd" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.561414 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682576 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682624 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682647 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gsfpz\" (UniqueName: \"kubernetes.io/projected/72396e2c-0937-41a4-936d-cea26e57a2f2-kube-api-access-gsfpz\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682666 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682710 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-run-httpd\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682728 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682783 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682801 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-config-data\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682821 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8cbddef-d1fd-490f-b499-3a9d2e570bce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682857 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682902 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682925 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682948 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.682999 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.683015 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kstqj\" (UniqueName: \"kubernetes.io/projected/f8cbddef-d1fd-490f-b499-3a9d2e570bce-kube-api-access-kstqj\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.683042 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5kr6\" (UniqueName: \"kubernetes.io/projected/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-api-access-v5kr6\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.683073 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8cbddef-d1fd-490f-b499-3a9d2e570bce-logs\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.683089 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-log-httpd\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.683114 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-scripts\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.683165 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785174 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785471 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-config-data\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785506 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8cbddef-d1fd-490f-b499-3a9d2e570bce-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785545 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785593 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785611 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785631 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785689 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" 
Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785709 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kstqj\" (UniqueName: \"kubernetes.io/projected/f8cbddef-d1fd-490f-b499-3a9d2e570bce-kube-api-access-kstqj\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785734 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5kr6\" (UniqueName: \"kubernetes.io/projected/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-api-access-v5kr6\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785760 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8cbddef-d1fd-490f-b499-3a9d2e570bce-logs\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785776 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-log-httpd\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785803 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-scripts\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785832 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785883 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785908 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785927 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsfpz\" (UniqueName: \"kubernetes.io/projected/72396e2c-0937-41a4-936d-cea26e57a2f2-kube-api-access-gsfpz\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785946 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785948 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8cbddef-d1fd-490f-b499-3a9d2e570bce-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785976 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-run-httpd\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.785993 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.786333 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8cbddef-d1fd-490f-b499-3a9d2e570bce-logs\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.786621 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.786643 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-log-httpd\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.787092 
4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-run-httpd\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.790221 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.790510 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.790636 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.791855 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-scripts\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.792594 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-public-tls-certs\") 
pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.796106 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.796944 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.799109 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.800572 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.800890 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cbddef-d1fd-490f-b499-3a9d2e570bce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 
crc kubenswrapper[4618]: I0121 09:19:24.801159 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.801374 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-config-data\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.802707 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5kr6\" (UniqueName: \"kubernetes.io/projected/8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3-kube-api-access-v5kr6\") pod \"kube-state-metrics-0\" (UID: \"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3\") " pod="openstack/kube-state-metrics-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.808309 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kstqj\" (UniqueName: \"kubernetes.io/projected/f8cbddef-d1fd-490f-b499-3a9d2e570bce-kube-api-access-kstqj\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.810228 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsfpz\" (UniqueName: \"kubernetes.io/projected/72396e2c-0937-41a4-936d-cea26e57a2f2-kube-api-access-gsfpz\") pod \"ceilometer-0\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " pod="openstack/ceilometer-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.826346 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f8cbddef-d1fd-490f-b499-3a9d2e570bce\") " pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.861343 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 21 09:19:24 crc kubenswrapper[4618]: I0121 09:19:24.883852 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.059844 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.101925 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.194879 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cjcd\" (UniqueName: \"kubernetes.io/projected/696c8b1d-e84a-45de-bb32-d2b5526bfabc-kube-api-access-6cjcd\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.195131 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-scripts\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.195202 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-config-data\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: 
I0121 09:19:25.195313 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c8b1d-e84a-45de-bb32-d2b5526bfabc-logs\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.195359 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-secret-key\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.195385 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-tls-certs\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.195408 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-combined-ca-bundle\") pod \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\" (UID: \"696c8b1d-e84a-45de-bb32-d2b5526bfabc\") " Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.195752 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696c8b1d-e84a-45de-bb32-d2b5526bfabc-logs" (OuterVolumeSpecName: "logs") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.196199 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c8b1d-e84a-45de-bb32-d2b5526bfabc-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.203750 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696c8b1d-e84a-45de-bb32-d2b5526bfabc-kube-api-access-6cjcd" (OuterVolumeSpecName: "kube-api-access-6cjcd") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). InnerVolumeSpecName "kube-api-access-6cjcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.215198 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.237485 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.240332 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-scripts" (OuterVolumeSpecName: "scripts") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.255571 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-config-data" (OuterVolumeSpecName: "config-data") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.267356 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "696c8b1d-e84a-45de-bb32-d2b5526bfabc" (UID: "696c8b1d-e84a-45de-bb32-d2b5526bfabc"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.299004 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.299037 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/696c8b1d-e84a-45de-bb32-d2b5526bfabc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.299048 4618 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.299060 4618 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-horizon-tls-certs\") on node 
\"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.299069 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c8b1d-e84a-45de-bb32-d2b5526bfabc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.299080 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cjcd\" (UniqueName: \"kubernetes.io/projected/696c8b1d-e84a-45de-bb32-d2b5526bfabc-kube-api-access-6cjcd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.358342 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:25 crc kubenswrapper[4618]: W0121 09:19:25.362351 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72396e2c_0937_41a4_936d_cea26e57a2f2.slice/crio-8f14950f299148156d5e98895e50527ae67700863ac4efcf478913652e55919a WatchSource:0}: Error finding container 8f14950f299148156d5e98895e50527ae67700863ac4efcf478913652e55919a: Status 404 returned error can't find the container with id 8f14950f299148156d5e98895e50527ae67700863ac4efcf478913652e55919a Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.368054 4618 generic.go:334] "Generic (PLEG): container finished" podID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerID="9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7" exitCode=137 Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.368983 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7784c76494-zjhpz" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.371269 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7784c76494-zjhpz" event={"ID":"696c8b1d-e84a-45de-bb32-d2b5526bfabc","Type":"ContainerDied","Data":"9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7"} Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.371316 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7784c76494-zjhpz" event={"ID":"696c8b1d-e84a-45de-bb32-d2b5526bfabc","Type":"ContainerDied","Data":"ff270c36f1c149aba25c0e947bee74d8d5a7a61be871551f3b9ec9d234a663fc"} Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.371336 4618 scope.go:117] "RemoveContainer" containerID="8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.452459 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7784c76494-zjhpz"] Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.461422 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7784c76494-zjhpz"] Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.528613 4618 scope.go:117] "RemoveContainer" containerID="9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.546812 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" path="/var/lib/kubelet/pods/696c8b1d-e84a-45de-bb32-d2b5526bfabc/volumes" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.547705 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701c5495-43ad-4008-b8e2-57b5b8a18d56" path="/var/lib/kubelet/pods/701c5495-43ad-4008-b8e2-57b5b8a18d56/volumes" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.548541 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c74af7ae-2eee-4b63-8515-0230ddf143c8" path="/var/lib/kubelet/pods/c74af7ae-2eee-4b63-8515-0230ddf143c8/volumes" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.549820 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df21bdd0-078c-45ea-9027-7c9c70f53513" path="/var/lib/kubelet/pods/df21bdd0-078c-45ea-9027-7c9c70f53513/volumes" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.555049 4618 scope.go:117] "RemoveContainer" containerID="8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074" Jan 21 09:19:25 crc kubenswrapper[4618]: E0121 09:19:25.556413 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074\": container with ID starting with 8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074 not found: ID does not exist" containerID="8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.556461 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074"} err="failed to get container status \"8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074\": rpc error: code = NotFound desc = could not find container \"8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074\": container with ID starting with 8e40f7e4097dba4b118ab356bf3f0556a79c6b1a39a6e561090055f6e5789074 not found: ID does not exist" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.556490 4618 scope.go:117] "RemoveContainer" containerID="9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7" Jan 21 09:19:25 crc kubenswrapper[4618]: E0121 09:19:25.557044 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7\": container with ID starting with 9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7 not found: ID does not exist" containerID="9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7" Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.557077 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7"} err="failed to get container status \"9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7\": rpc error: code = NotFound desc = could not find container \"9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7\": container with ID starting with 9526520b66b5fecbf7e108b070ad1e4f23bfe5421e937f5adbdd92e74c13f4d7 not found: ID does not exist" Jan 21 09:19:25 crc kubenswrapper[4618]: W0121 09:19:25.563457 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8cbddef_d1fd_490f_b499_3a9d2e570bce.slice/crio-571233922ff05041fecb7d699bd1cb1888c04f9881cf39d61490f0321a4e434b WatchSource:0}: Error finding container 571233922ff05041fecb7d699bd1cb1888c04f9881cf39d61490f0321a4e434b: Status 404 returned error can't find the container with id 571233922ff05041fecb7d699bd1cb1888c04f9881cf39d61490f0321a4e434b Jan 21 09:19:25 crc kubenswrapper[4618]: W0121 09:19:25.566371 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bca57d0_25ca_4daa_a96b_b3b70b2a2ac3.slice/crio-68ba77f85fa428d7c0fb1b88bf8c8135a5b784b859f0a53f08fe2fedc2a03d9a WatchSource:0}: Error finding container 68ba77f85fa428d7c0fb1b88bf8c8135a5b784b859f0a53f08fe2fedc2a03d9a: Status 404 returned error can't find the container with id 68ba77f85fa428d7c0fb1b88bf8c8135a5b784b859f0a53f08fe2fedc2a03d9a Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 
09:19:25.572746 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 09:19:25 crc kubenswrapper[4618]: I0121 09:19:25.579724 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.278249 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.380691 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.452933 4618 generic.go:334] "Generic (PLEG): container finished" podID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerID="ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87" exitCode=0 Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.453001 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a68fe88-de0c-468d-998a-77d5d5a29d83","Type":"ContainerDied","Data":"ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87"} Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.453030 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a68fe88-de0c-468d-998a-77d5d5a29d83","Type":"ContainerDied","Data":"31c5723af720f3fceb3e1a8ddcbef8a950e7f6f788a7959ebf25a85b567766b5"} Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.453035 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.453049 4618 scope.go:117] "RemoveContainer" containerID="ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.457030 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8cbddef-d1fd-490f-b499-3a9d2e570bce","Type":"ContainerStarted","Data":"f02f835e5dda944d53f71aac5415d20998d955a9345884c34077cdf55ed6754a"} Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.457079 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8cbddef-d1fd-490f-b499-3a9d2e570bce","Type":"ContainerStarted","Data":"571233922ff05041fecb7d699bd1cb1888c04f9881cf39d61490f0321a4e434b"} Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.469452 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3","Type":"ContainerStarted","Data":"05079e5290392932a330523165f4ca64efb84a973d868e93d59de3d7d194a3c9"} Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.469510 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3","Type":"ContainerStarted","Data":"68ba77f85fa428d7c0fb1b88bf8c8135a5b784b859f0a53f08fe2fedc2a03d9a"} Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.470642 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.486483 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerStarted","Data":"8f14950f299148156d5e98895e50527ae67700863ac4efcf478913652e55919a"} Jan 21 09:19:26 crc 
kubenswrapper[4618]: I0121 09:19:26.488592 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.181356523 podStartE2EDuration="2.488550074s" podCreationTimestamp="2026-01-21 09:19:24 +0000 UTC" firstStartedPulling="2026-01-21 09:19:25.569116369 +0000 UTC m=+964.319583686" lastFinishedPulling="2026-01-21 09:19:25.87630992 +0000 UTC m=+964.626777237" observedRunningTime="2026-01-21 09:19:26.484551875 +0000 UTC m=+965.235019192" watchObservedRunningTime="2026-01-21 09:19:26.488550074 +0000 UTC m=+965.239017380" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.501493 4618 scope.go:117] "RemoveContainer" containerID="930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.537436 4618 scope.go:117] "RemoveContainer" containerID="ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87" Jan 21 09:19:26 crc kubenswrapper[4618]: E0121 09:19:26.537818 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87\": container with ID starting with ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87 not found: ID does not exist" containerID="ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.537851 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87"} err="failed to get container status \"ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87\": rpc error: code = NotFound desc = could not find container \"ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87\": container with ID starting with ed4512f752e5f1f36e556bd6fd44a82858d003c6865d1fc09dc2a99380c7ab87 not found: ID does 
not exist" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.537871 4618 scope.go:117] "RemoveContainer" containerID="930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303" Jan 21 09:19:26 crc kubenswrapper[4618]: E0121 09:19:26.538097 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303\": container with ID starting with 930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303 not found: ID does not exist" containerID="930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.538122 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303"} err="failed to get container status \"930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303\": rpc error: code = NotFound desc = could not find container \"930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303\": container with ID starting with 930399f059f2f18668505f9c6db60bd4502b0eee9b5213ca9729778f189f5303 not found: ID does not exist" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.540000 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-internal-tls-certs\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.540201 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wlp9\" (UniqueName: \"kubernetes.io/projected/7a68fe88-de0c-468d-998a-77d5d5a29d83-kube-api-access-4wlp9\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc 
kubenswrapper[4618]: I0121 09:19:26.540438 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-logs\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.540540 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-scripts\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.540599 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-httpd-run\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.540635 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-combined-ca-bundle\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.541160 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-config-data\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: \"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.541212 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"7a68fe88-de0c-468d-998a-77d5d5a29d83\" (UID: 
\"7a68fe88-de0c-468d-998a-77d5d5a29d83\") " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.542335 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-logs" (OuterVolumeSpecName: "logs") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.542359 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.546351 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a68fe88-de0c-468d-998a-77d5d5a29d83-kube-api-access-4wlp9" (OuterVolumeSpecName: "kube-api-access-4wlp9") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "kube-api-access-4wlp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.547040 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-scripts" (OuterVolumeSpecName: "scripts") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.561161 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.569787 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.604839 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.612309 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-config-data" (OuterVolumeSpecName: "config-data") pod "7a68fe88-de0c-468d-998a-77d5d5a29d83" (UID: "7a68fe88-de0c-468d-998a-77d5d5a29d83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.622479 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.633025 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d5bd5664f-ncbh6" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643580 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643795 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643806 4618 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643816 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wlp9\" (UniqueName: \"kubernetes.io/projected/7a68fe88-de0c-468d-998a-77d5d5a29d83-kube-api-access-4wlp9\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643825 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643833 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 
crc kubenswrapper[4618]: I0121 09:19:26.643840 4618 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a68fe88-de0c-468d-998a-77d5d5a29d83-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.643848 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a68fe88-de0c-468d-998a-77d5d5a29d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.689415 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.745941 4618 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.804027 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.824187 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.835559 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:19:26 crc kubenswrapper[4618]: E0121 09:19:26.836016 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-log" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836036 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-log" Jan 21 09:19:26 crc kubenswrapper[4618]: E0121 09:19:26.836053 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836059 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" Jan 21 09:19:26 crc kubenswrapper[4618]: E0121 09:19:26.836087 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon-log" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836093 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon-log" Jan 21 09:19:26 crc kubenswrapper[4618]: E0121 09:19:26.836110 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-httpd" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836115 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-httpd" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836281 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-log" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836295 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon-log" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836302 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="696c8b1d-e84a-45de-bb32-d2b5526bfabc" containerName="horizon" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.836317 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" containerName="glance-httpd" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.837288 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.839931 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.840110 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.843870 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.957877 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.957958 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c616426c-57a8-42a0-8dde-7ef7f56caf00-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.957990 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pks99\" (UniqueName: \"kubernetes.io/projected/c616426c-57a8-42a0-8dde-7ef7f56caf00-kube-api-access-pks99\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.958136 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c616426c-57a8-42a0-8dde-7ef7f56caf00-logs\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.958219 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.958364 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.958407 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:26 crc kubenswrapper[4618]: I0121 09:19:26.958433 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060333 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060381 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c616426c-57a8-42a0-8dde-7ef7f56caf00-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060411 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pks99\" (UniqueName: \"kubernetes.io/projected/c616426c-57a8-42a0-8dde-7ef7f56caf00-kube-api-access-pks99\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060469 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c616426c-57a8-42a0-8dde-7ef7f56caf00-logs\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060504 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060545 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060572 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.060592 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.061055 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.061449 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c616426c-57a8-42a0-8dde-7ef7f56caf00-logs\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.061521 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c616426c-57a8-42a0-8dde-7ef7f56caf00-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.065934 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.078081 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.078267 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.078528 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c616426c-57a8-42a0-8dde-7ef7f56caf00-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.078613 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pks99\" (UniqueName: \"kubernetes.io/projected/c616426c-57a8-42a0-8dde-7ef7f56caf00-kube-api-access-pks99\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " 
pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.097599 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c616426c-57a8-42a0-8dde-7ef7f56caf00\") " pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.163487 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.495172 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerStarted","Data":"8bebff36622de6f26be05b13414eeee8dc0300e7f34479d07c669cd9e3f77431"} Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.499442 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8cbddef-d1fd-490f-b499-3a9d2e570bce","Type":"ContainerStarted","Data":"691ce6b25383bedfbe89643bb6d59c1408769cbf14f27990889389d3fca1cc5e"} Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.517980 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.5179558159999997 podStartE2EDuration="3.517955816s" podCreationTimestamp="2026-01-21 09:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:19:27.514369782 +0000 UTC m=+966.264837088" watchObservedRunningTime="2026-01-21 09:19:27.517955816 +0000 UTC m=+966.268423122" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.561116 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a68fe88-de0c-468d-998a-77d5d5a29d83" 
path="/var/lib/kubelet/pods/7a68fe88-de0c-468d-998a-77d5d5a29d83/volumes" Jan 21 09:19:27 crc kubenswrapper[4618]: I0121 09:19:27.652552 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 21 09:19:28 crc kubenswrapper[4618]: I0121 09:19:28.518535 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerStarted","Data":"6956536ab57bf3dc5c771cfcc576bf92dc75007d515efbe50a97652735addbb2"} Jan 21 09:19:28 crc kubenswrapper[4618]: I0121 09:19:28.519091 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerStarted","Data":"202fffe68ae82cc2ce7c59052ee42fdd5e228725a57bfc20ced4fb745ebc72a0"} Jan 21 09:19:28 crc kubenswrapper[4618]: I0121 09:19:28.530178 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c616426c-57a8-42a0-8dde-7ef7f56caf00","Type":"ContainerStarted","Data":"5537822e402ea79fd264d4897130528d855097b5698973b75366b00eb53e3471"} Jan 21 09:19:28 crc kubenswrapper[4618]: I0121 09:19:28.530210 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c616426c-57a8-42a0-8dde-7ef7f56caf00","Type":"ContainerStarted","Data":"3ad338446d221a02f8218fe7e834d8a850bc7218bbe5a8019217685c6483cf6e"} Jan 21 09:19:29 crc kubenswrapper[4618]: I0121 09:19:29.547294 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c616426c-57a8-42a0-8dde-7ef7f56caf00","Type":"ContainerStarted","Data":"ba7a24bebc703a42daf2d3178577b02cec07841dd4c666b618846ff4b52f46ee"} Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.562834 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerStarted","Data":"b6e5327b84c8a3fe6f2e7f8c2f6a3ad921b11cdaa5a71443aab35dcb2917170d"} Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.563742 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-central-agent" containerID="cri-o://8bebff36622de6f26be05b13414eeee8dc0300e7f34479d07c669cd9e3f77431" gracePeriod=30 Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.563943 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.563985 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="proxy-httpd" containerID="cri-o://b6e5327b84c8a3fe6f2e7f8c2f6a3ad921b11cdaa5a71443aab35dcb2917170d" gracePeriod=30 Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.564032 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="sg-core" containerID="cri-o://6956536ab57bf3dc5c771cfcc576bf92dc75007d515efbe50a97652735addbb2" gracePeriod=30 Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.564066 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-notification-agent" containerID="cri-o://202fffe68ae82cc2ce7c59052ee42fdd5e228725a57bfc20ced4fb745ebc72a0" gracePeriod=30 Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.571100 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.571085023 podStartE2EDuration="5.571085023s" podCreationTimestamp="2026-01-21 
09:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:19:29.571901198 +0000 UTC m=+968.322368515" watchObservedRunningTime="2026-01-21 09:19:31.571085023 +0000 UTC m=+970.321552331" Jan 21 09:19:31 crc kubenswrapper[4618]: I0121 09:19:31.590555 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.576974948 podStartE2EDuration="7.590540942s" podCreationTimestamp="2026-01-21 09:19:24 +0000 UTC" firstStartedPulling="2026-01-21 09:19:25.364396244 +0000 UTC m=+964.114863561" lastFinishedPulling="2026-01-21 09:19:30.377962238 +0000 UTC m=+969.128429555" observedRunningTime="2026-01-21 09:19:31.584417906 +0000 UTC m=+970.334885214" watchObservedRunningTime="2026-01-21 09:19:31.590540942 +0000 UTC m=+970.341008259" Jan 21 09:19:32 crc kubenswrapper[4618]: I0121 09:19:32.577728 4618 generic.go:334] "Generic (PLEG): container finished" podID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerID="b6e5327b84c8a3fe6f2e7f8c2f6a3ad921b11cdaa5a71443aab35dcb2917170d" exitCode=0 Jan 21 09:19:32 crc kubenswrapper[4618]: I0121 09:19:32.577772 4618 generic.go:334] "Generic (PLEG): container finished" podID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerID="6956536ab57bf3dc5c771cfcc576bf92dc75007d515efbe50a97652735addbb2" exitCode=2 Jan 21 09:19:32 crc kubenswrapper[4618]: I0121 09:19:32.577781 4618 generic.go:334] "Generic (PLEG): container finished" podID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerID="202fffe68ae82cc2ce7c59052ee42fdd5e228725a57bfc20ced4fb745ebc72a0" exitCode=0 Jan 21 09:19:32 crc kubenswrapper[4618]: I0121 09:19:32.577809 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerDied","Data":"b6e5327b84c8a3fe6f2e7f8c2f6a3ad921b11cdaa5a71443aab35dcb2917170d"} Jan 21 09:19:32 crc kubenswrapper[4618]: 
I0121 09:19:32.577887 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerDied","Data":"6956536ab57bf3dc5c771cfcc576bf92dc75007d515efbe50a97652735addbb2"} Jan 21 09:19:32 crc kubenswrapper[4618]: I0121 09:19:32.577901 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerDied","Data":"202fffe68ae82cc2ce7c59052ee42fdd5e228725a57bfc20ced4fb745ebc72a0"} Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.607508 4618 generic.go:334] "Generic (PLEG): container finished" podID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerID="8bebff36622de6f26be05b13414eeee8dc0300e7f34479d07c669cd9e3f77431" exitCode=0 Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.607589 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerDied","Data":"8bebff36622de6f26be05b13414eeee8dc0300e7f34479d07c669cd9e3f77431"} Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.820256 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.862008 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.862378 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.890966 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.895412 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.925657 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-combined-ca-bundle\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.925782 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsfpz\" (UniqueName: \"kubernetes.io/projected/72396e2c-0937-41a4-936d-cea26e57a2f2-kube-api-access-gsfpz\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.925879 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-scripts\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.925904 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-sg-core-conf-yaml\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.925966 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-run-httpd\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.926006 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-ceilometer-tls-certs\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.926049 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-log-httpd\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.926076 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-config-data\") pod \"72396e2c-0937-41a4-936d-cea26e57a2f2\" (UID: \"72396e2c-0937-41a4-936d-cea26e57a2f2\") " Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.926888 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.926974 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.943374 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72396e2c-0937-41a4-936d-cea26e57a2f2-kube-api-access-gsfpz" (OuterVolumeSpecName: "kube-api-access-gsfpz") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "kube-api-access-gsfpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.944756 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-scripts" (OuterVolumeSpecName: "scripts") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.957031 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:34 crc kubenswrapper[4618]: I0121 09:19:34.989978 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.015440 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029178 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029261 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsfpz\" (UniqueName: \"kubernetes.io/projected/72396e2c-0937-41a4-936d-cea26e57a2f2-kube-api-access-gsfpz\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029327 4618 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029384 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-scripts\") 
on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029435 4618 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029481 4618 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.029531 4618 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72396e2c-0937-41a4-936d-cea26e57a2f2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.034779 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-config-data" (OuterVolumeSpecName: "config-data") pod "72396e2c-0937-41a4-936d-cea26e57a2f2" (UID: "72396e2c-0937-41a4-936d-cea26e57a2f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.114933 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.131522 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72396e2c-0937-41a4-936d-cea26e57a2f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.470232 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-c9fb5"] Jan 21 09:19:35 crc kubenswrapper[4618]: E0121 09:19:35.470925 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="sg-core" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.470943 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="sg-core" Jan 21 09:19:35 crc kubenswrapper[4618]: E0121 09:19:35.470964 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-notification-agent" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.470970 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-notification-agent" Jan 21 09:19:35 crc kubenswrapper[4618]: E0121 09:19:35.470985 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-central-agent" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.470990 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-central-agent" Jan 21 09:19:35 crc kubenswrapper[4618]: E0121 09:19:35.470999 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="proxy-httpd" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.471004 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="proxy-httpd" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.471218 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-central-agent" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.471239 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="ceilometer-notification-agent" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.471253 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="sg-core" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.471269 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" containerName="proxy-httpd" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.471831 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.476614 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-63df-account-create-update-sssqx"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.477864 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.478956 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.502199 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-63df-account-create-update-sssqx"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.532006 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c9fb5"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.542810 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9602ab-6ad6-4b2a-b142-5b646433ed19-operator-scripts\") pod \"nova-api-db-create-c9fb5\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.542928 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2xm\" (UniqueName: \"kubernetes.io/projected/1a9602ab-6ad6-4b2a-b142-5b646433ed19-kube-api-access-dh2xm\") pod \"nova-api-db-create-c9fb5\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.591056 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-k47ps"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.592684 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.605826 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k47ps"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.619984 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72396e2c-0937-41a4-936d-cea26e57a2f2","Type":"ContainerDied","Data":"8f14950f299148156d5e98895e50527ae67700863ac4efcf478913652e55919a"} Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.620044 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.620070 4618 scope.go:117] "RemoveContainer" containerID="b6e5327b84c8a3fe6f2e7f8c2f6a3ad921b11cdaa5a71443aab35dcb2917170d" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.620076 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.620524 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.646223 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9602ab-6ad6-4b2a-b142-5b646433ed19-operator-scripts\") pod \"nova-api-db-create-c9fb5\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.646394 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2xm\" (UniqueName: \"kubernetes.io/projected/1a9602ab-6ad6-4b2a-b142-5b646433ed19-kube-api-access-dh2xm\") pod \"nova-api-db-create-c9fb5\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.646529 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr28v\" (UniqueName: \"kubernetes.io/projected/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-kube-api-access-fr28v\") pod \"nova-api-63df-account-create-update-sssqx\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.646676 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-operator-scripts\") pod \"nova-api-63df-account-create-update-sssqx\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.647396 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9602ab-6ad6-4b2a-b142-5b646433ed19-operator-scripts\") pod \"nova-api-db-create-c9fb5\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.655621 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.658880 4618 scope.go:117] "RemoveContainer" containerID="6956536ab57bf3dc5c771cfcc576bf92dc75007d515efbe50a97652735addbb2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.671853 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2xm\" (UniqueName: \"kubernetes.io/projected/1a9602ab-6ad6-4b2a-b142-5b646433ed19-kube-api-access-dh2xm\") pod \"nova-api-db-create-c9fb5\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.681813 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.690334 4618 scope.go:117] "RemoveContainer" containerID="202fffe68ae82cc2ce7c59052ee42fdd5e228725a57bfc20ced4fb745ebc72a0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.691459 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.693617 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.696715 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.703055 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.703250 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.733766 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wvhr2"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.735346 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.751069 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhz8h\" (UniqueName: \"kubernetes.io/projected/b2cf19dc-4223-41c2-a849-d02a34917dad-kube-api-access-fhz8h\") pod \"nova-cell0-db-create-k47ps\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.751124 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr28v\" (UniqueName: \"kubernetes.io/projected/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-kube-api-access-fr28v\") pod \"nova-api-63df-account-create-update-sssqx\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.751321 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-operator-scripts\") pod \"nova-api-63df-account-create-update-sssqx\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.751485 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2cf19dc-4223-41c2-a849-d02a34917dad-operator-scripts\") pod \"nova-cell0-db-create-k47ps\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.752964 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-operator-scripts\") pod \"nova-api-63df-account-create-update-sssqx\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.756652 4618 scope.go:117] "RemoveContainer" containerID="8bebff36622de6f26be05b13414eeee8dc0300e7f34479d07c669cd9e3f77431" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.762191 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d612-account-create-update-6whq9"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.763456 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.765170 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.783352 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.805181 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wvhr2"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.815569 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d612-account-create-update-6whq9"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.817326 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr28v\" (UniqueName: \"kubernetes.io/projected/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-kube-api-access-fr28v\") pod \"nova-api-63df-account-create-update-sssqx\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855177 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-config-data\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855238 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2cf19dc-4223-41c2-a849-d02a34917dad-operator-scripts\") pod \"nova-cell0-db-create-k47ps\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855265 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855296 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-run-httpd\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855332 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855348 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrg6n\" (UniqueName: \"kubernetes.io/projected/290b1aba-2b06-40f3-87a2-3cb601cf9b63-kube-api-access-zrg6n\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855403 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7btk\" (UniqueName: \"kubernetes.io/projected/03c89387-da4a-4f04-ae29-8acf81d8c18f-kube-api-access-x7btk\") pod \"nova-cell1-db-create-wvhr2\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855429 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-log-httpd\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855469 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhz8h\" (UniqueName: \"kubernetes.io/projected/b2cf19dc-4223-41c2-a849-d02a34917dad-kube-api-access-fhz8h\") pod \"nova-cell0-db-create-k47ps\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855490 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03c89387-da4a-4f04-ae29-8acf81d8c18f-operator-scripts\") pod \"nova-cell1-db-create-wvhr2\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855513 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855536 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2s4b\" (UniqueName: \"kubernetes.io/projected/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-kube-api-access-t2s4b\") pod \"nova-cell0-d612-account-create-update-6whq9\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855572 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-scripts\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.855610 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-operator-scripts\") pod \"nova-cell0-d612-account-create-update-6whq9\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.856342 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2cf19dc-4223-41c2-a849-d02a34917dad-operator-scripts\") pod \"nova-cell0-db-create-k47ps\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.857418 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.857588 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.884522 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhz8h\" (UniqueName: \"kubernetes.io/projected/b2cf19dc-4223-41c2-a849-d02a34917dad-kube-api-access-fhz8h\") pod \"nova-cell0-db-create-k47ps\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.908731 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.937193 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9f33-account-create-update-nhlhp"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.938433 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.945025 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.959970 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03c89387-da4a-4f04-ae29-8acf81d8c18f-operator-scripts\") pod \"nova-cell1-db-create-wvhr2\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960013 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960043 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2s4b\" (UniqueName: \"kubernetes.io/projected/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-kube-api-access-t2s4b\") pod \"nova-cell0-d612-account-create-update-6whq9\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960095 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-scripts\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960135 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-operator-scripts\") pod \"nova-cell0-d612-account-create-update-6whq9\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960192 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-config-data\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960227 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960262 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-run-httpd\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960304 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " 
pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960321 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrg6n\" (UniqueName: \"kubernetes.io/projected/290b1aba-2b06-40f3-87a2-3cb601cf9b63-kube-api-access-zrg6n\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960386 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7btk\" (UniqueName: \"kubernetes.io/projected/03c89387-da4a-4f04-ae29-8acf81d8c18f-kube-api-access-x7btk\") pod \"nova-cell1-db-create-wvhr2\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960414 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-log-httpd\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960634 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03c89387-da4a-4f04-ae29-8acf81d8c18f-operator-scripts\") pod \"nova-cell1-db-create-wvhr2\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.960895 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-log-httpd\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.962914 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-operator-scripts\") pod \"nova-cell0-d612-account-create-update-6whq9\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.966755 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f33-account-create-update-nhlhp"] Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.972585 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.976297 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-run-httpd\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.978750 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.984836 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2s4b\" (UniqueName: \"kubernetes.io/projected/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-kube-api-access-t2s4b\") pod \"nova-cell0-d612-account-create-update-6whq9\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 
09:19:35.987727 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.988476 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-config-data\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.988868 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-scripts\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.990630 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7btk\" (UniqueName: \"kubernetes.io/projected/03c89387-da4a-4f04-ae29-8acf81d8c18f-kube-api-access-x7btk\") pod \"nova-cell1-db-create-wvhr2\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:35 crc kubenswrapper[4618]: I0121 09:19:35.991872 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrg6n\" (UniqueName: \"kubernetes.io/projected/290b1aba-2b06-40f3-87a2-3cb601cf9b63-kube-api-access-zrg6n\") pod \"ceilometer-0\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " pod="openstack/ceilometer-0" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.039175 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.062157 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-operator-scripts\") pod \"nova-cell1-9f33-account-create-update-nhlhp\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.062349 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfk5\" (UniqueName: \"kubernetes.io/projected/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-kube-api-access-fgfk5\") pod \"nova-cell1-9f33-account-create-update-nhlhp\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.081529 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.086922 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.163916 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfk5\" (UniqueName: \"kubernetes.io/projected/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-kube-api-access-fgfk5\") pod \"nova-cell1-9f33-account-create-update-nhlhp\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.164017 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-operator-scripts\") pod \"nova-cell1-9f33-account-create-update-nhlhp\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.164947 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-operator-scripts\") pod \"nova-cell1-9f33-account-create-update-nhlhp\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.183222 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfk5\" (UniqueName: \"kubernetes.io/projected/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-kube-api-access-fgfk5\") pod \"nova-cell1-9f33-account-create-update-nhlhp\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.288485 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.439274 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c9fb5"] Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.533998 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-63df-account-create-update-sssqx"] Jan 21 09:19:36 crc kubenswrapper[4618]: W0121 09:19:36.539641 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86b5bcc6_7694_4b4b_b68d_44eaf7b90eb6.slice/crio-46389b23e482aa1d0eeb3955f1135268534cc02e49f8e9ff47d7c6c9737d08a8 WatchSource:0}: Error finding container 46389b23e482aa1d0eeb3955f1135268534cc02e49f8e9ff47d7c6c9737d08a8: Status 404 returned error can't find the container with id 46389b23e482aa1d0eeb3955f1135268534cc02e49f8e9ff47d7c6c9737d08a8 Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.616434 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k47ps"] Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.629898 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.654944 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wvhr2"] Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.673804 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k47ps" event={"ID":"b2cf19dc-4223-41c2-a849-d02a34917dad","Type":"ContainerStarted","Data":"ffcdd972f207c8fc5bcd240fd45d51615b7b2358f64273759e0fae64b503a809"} Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.679878 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-63df-account-create-update-sssqx" 
event={"ID":"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6","Type":"ContainerStarted","Data":"46389b23e482aa1d0eeb3955f1135268534cc02e49f8e9ff47d7c6c9737d08a8"} Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.681683 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c9fb5" event={"ID":"1a9602ab-6ad6-4b2a-b142-5b646433ed19","Type":"ContainerStarted","Data":"21f4bcb63047ade38e1cad8da0ea5abe3345e973290a68a76d77dd91ccb5d94b"} Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.763453 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d612-account-create-update-6whq9"] Jan 21 09:19:36 crc kubenswrapper[4618]: W0121 09:19:36.770511 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc06dae48_dad2_4fc4_8083_5a83b1cb6eb7.slice/crio-d2b45cc7a0d953dfef336871c5b0298eb56738ae4f6218f0d157e387e703f39f WatchSource:0}: Error finding container d2b45cc7a0d953dfef336871c5b0298eb56738ae4f6218f0d157e387e703f39f: Status 404 returned error can't find the container with id d2b45cc7a0d953dfef336871c5b0298eb56738ae4f6218f0d157e387e703f39f Jan 21 09:19:36 crc kubenswrapper[4618]: I0121 09:19:36.853447 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f33-account-create-update-nhlhp"] Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.163991 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.164304 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.192079 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.193173 4618 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.550703 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72396e2c-0937-41a4-936d-cea26e57a2f2" path="/var/lib/kubelet/pods/72396e2c-0937-41a4-936d-cea26e57a2f2/volumes" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.694341 4618 generic.go:334] "Generic (PLEG): container finished" podID="1a9602ab-6ad6-4b2a-b142-5b646433ed19" containerID="0abcf3420c84426b5c3032545f83baf2a312536ae71853c86b42c0c18c469ba2" exitCode=0 Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.694462 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c9fb5" event={"ID":"1a9602ab-6ad6-4b2a-b142-5b646433ed19","Type":"ContainerDied","Data":"0abcf3420c84426b5c3032545f83baf2a312536ae71853c86b42c0c18c469ba2"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.696514 4618 generic.go:334] "Generic (PLEG): container finished" podID="b2cf19dc-4223-41c2-a849-d02a34917dad" containerID="c926bae56d493038364720fdbe87a9f55b9377cd0c66f33d09c541c0cc66af11" exitCode=0 Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.696629 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k47ps" event={"ID":"b2cf19dc-4223-41c2-a849-d02a34917dad","Type":"ContainerDied","Data":"c926bae56d493038364720fdbe87a9f55b9377cd0c66f33d09c541c0cc66af11"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.699839 4618 generic.go:334] "Generic (PLEG): container finished" podID="bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" containerID="68008014429ca04cf8d1c188b218e685929dcdba214684e5658bfd4718a27db8" exitCode=0 Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.699910 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" 
event={"ID":"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b","Type":"ContainerDied","Data":"68008014429ca04cf8d1c188b218e685929dcdba214684e5658bfd4718a27db8"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.699985 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" event={"ID":"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b","Type":"ContainerStarted","Data":"33a19b576b8f5c87b5a157b6ddad3df1b7f438faa748d2b40fe04fbd6a273726"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.702196 4618 generic.go:334] "Generic (PLEG): container finished" podID="c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" containerID="c68d8e997c123e5fc8bccc47d13892131b96bbafa56501a9813c2fd85303e947" exitCode=0 Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.702289 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d612-account-create-update-6whq9" event={"ID":"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7","Type":"ContainerDied","Data":"c68d8e997c123e5fc8bccc47d13892131b96bbafa56501a9813c2fd85303e947"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.702333 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d612-account-create-update-6whq9" event={"ID":"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7","Type":"ContainerStarted","Data":"d2b45cc7a0d953dfef336871c5b0298eb56738ae4f6218f0d157e387e703f39f"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.703984 4618 generic.go:334] "Generic (PLEG): container finished" podID="03c89387-da4a-4f04-ae29-8acf81d8c18f" containerID="f6950cb02ef07e93504da2db387baf84efb0e5ade66a6dc20e96b3da9cea4692" exitCode=0 Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.704039 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wvhr2" event={"ID":"03c89387-da4a-4f04-ae29-8acf81d8c18f","Type":"ContainerDied","Data":"f6950cb02ef07e93504da2db387baf84efb0e5ade66a6dc20e96b3da9cea4692"} Jan 21 09:19:37 crc kubenswrapper[4618]: 
I0121 09:19:37.704063 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wvhr2" event={"ID":"03c89387-da4a-4f04-ae29-8acf81d8c18f","Type":"ContainerStarted","Data":"1deed34c59e5e1cb572eaa77aaac42723cf9ab825a35a4301eac5aa614e71917"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.706532 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerStarted","Data":"4523d5223dc263eb1295a3ff5c472317f398a6a5f2b599ae79353c61ef0d36c1"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.706562 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerStarted","Data":"7187bc8a634b81ae13955ce34d72230a06d7ebb61d6233089d8a203d653cc53b"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.708552 4618 generic.go:334] "Generic (PLEG): container finished" podID="86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" containerID="b79c442395a0ab91161432ef7ec7c224815b7560661146b28dfd2f0b167f7388" exitCode=0 Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.708648 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-63df-account-create-update-sssqx" event={"ID":"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6","Type":"ContainerDied","Data":"b79c442395a0ab91161432ef7ec7c224815b7560661146b28dfd2f0b167f7388"} Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.708963 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.708999 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.914491 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 
09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.914611 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:19:37 crc kubenswrapper[4618]: I0121 09:19:37.918941 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 21 09:19:38 crc kubenswrapper[4618]: I0121 09:19:38.718468 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerStarted","Data":"4002cb1196fdf40f6ec255360ac87f1822cf4cb14e06d3e3ac592254977b9a3a"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.157259 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.326716 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr28v\" (UniqueName: \"kubernetes.io/projected/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-kube-api-access-fr28v\") pod \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.327081 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-operator-scripts\") pod \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\" (UID: \"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.328266 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" (UID: "86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.331332 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-kube-api-access-fr28v" (OuterVolumeSpecName: "kube-api-access-fr28v") pod "86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" (UID: "86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6"). InnerVolumeSpecName "kube-api-access-fr28v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.434874 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.435752 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr28v\" (UniqueName: \"kubernetes.io/projected/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6-kube-api-access-fr28v\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.463692 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.483082 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.485311 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.526748 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.557970 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.639829 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2xm\" (UniqueName: \"kubernetes.io/projected/1a9602ab-6ad6-4b2a-b142-5b646433ed19-kube-api-access-dh2xm\") pod \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.639913 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhz8h\" (UniqueName: \"kubernetes.io/projected/b2cf19dc-4223-41c2-a849-d02a34917dad-kube-api-access-fhz8h\") pod \"b2cf19dc-4223-41c2-a849-d02a34917dad\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.639963 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7btk\" (UniqueName: \"kubernetes.io/projected/03c89387-da4a-4f04-ae29-8acf81d8c18f-kube-api-access-x7btk\") pod \"03c89387-da4a-4f04-ae29-8acf81d8c18f\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.640018 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2s4b\" (UniqueName: \"kubernetes.io/projected/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-kube-api-access-t2s4b\") pod \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.640108 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9602ab-6ad6-4b2a-b142-5b646433ed19-operator-scripts\") pod \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\" (UID: \"1a9602ab-6ad6-4b2a-b142-5b646433ed19\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.640185 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2cf19dc-4223-41c2-a849-d02a34917dad-operator-scripts\") pod \"b2cf19dc-4223-41c2-a849-d02a34917dad\" (UID: \"b2cf19dc-4223-41c2-a849-d02a34917dad\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.640229 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-operator-scripts\") pod \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\" (UID: \"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.640289 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03c89387-da4a-4f04-ae29-8acf81d8c18f-operator-scripts\") pod \"03c89387-da4a-4f04-ae29-8acf81d8c18f\" (UID: \"03c89387-da4a-4f04-ae29-8acf81d8c18f\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.641798 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a9602ab-6ad6-4b2a-b142-5b646433ed19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a9602ab-6ad6-4b2a-b142-5b646433ed19" (UID: "1a9602ab-6ad6-4b2a-b142-5b646433ed19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.642466 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" (UID: "c06dae48-dad2-4fc4-8083-5a83b1cb6eb7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.644084 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2cf19dc-4223-41c2-a849-d02a34917dad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2cf19dc-4223-41c2-a849-d02a34917dad" (UID: "b2cf19dc-4223-41c2-a849-d02a34917dad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.644180 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03c89387-da4a-4f04-ae29-8acf81d8c18f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03c89387-da4a-4f04-ae29-8acf81d8c18f" (UID: "03c89387-da4a-4f04-ae29-8acf81d8c18f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.650308 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-kube-api-access-t2s4b" (OuterVolumeSpecName: "kube-api-access-t2s4b") pod "c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" (UID: "c06dae48-dad2-4fc4-8083-5a83b1cb6eb7"). InnerVolumeSpecName "kube-api-access-t2s4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.650340 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9602ab-6ad6-4b2a-b142-5b646433ed19-kube-api-access-dh2xm" (OuterVolumeSpecName: "kube-api-access-dh2xm") pod "1a9602ab-6ad6-4b2a-b142-5b646433ed19" (UID: "1a9602ab-6ad6-4b2a-b142-5b646433ed19"). InnerVolumeSpecName "kube-api-access-dh2xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.650356 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c89387-da4a-4f04-ae29-8acf81d8c18f-kube-api-access-x7btk" (OuterVolumeSpecName: "kube-api-access-x7btk") pod "03c89387-da4a-4f04-ae29-8acf81d8c18f" (UID: "03c89387-da4a-4f04-ae29-8acf81d8c18f"). InnerVolumeSpecName "kube-api-access-x7btk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.651371 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2cf19dc-4223-41c2-a849-d02a34917dad-kube-api-access-fhz8h" (OuterVolumeSpecName: "kube-api-access-fhz8h") pod "b2cf19dc-4223-41c2-a849-d02a34917dad" (UID: "b2cf19dc-4223-41c2-a849-d02a34917dad"). InnerVolumeSpecName "kube-api-access-fhz8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.727867 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wvhr2" event={"ID":"03c89387-da4a-4f04-ae29-8acf81d8c18f","Type":"ContainerDied","Data":"1deed34c59e5e1cb572eaa77aaac42723cf9ab825a35a4301eac5aa614e71917"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.727936 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1deed34c59e5e1cb572eaa77aaac42723cf9ab825a35a4301eac5aa614e71917" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.727891 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wvhr2" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.730389 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerStarted","Data":"8a5dca9b3b50f6a120d7ecac33673d88b85b7761a56952aef9f11a198407e952"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.732030 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-63df-account-create-update-sssqx" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.732193 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-63df-account-create-update-sssqx" event={"ID":"86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6","Type":"ContainerDied","Data":"46389b23e482aa1d0eeb3955f1135268534cc02e49f8e9ff47d7c6c9737d08a8"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.732235 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46389b23e482aa1d0eeb3955f1135268534cc02e49f8e9ff47d7c6c9737d08a8" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.733813 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c9fb5" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.733816 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c9fb5" event={"ID":"1a9602ab-6ad6-4b2a-b142-5b646433ed19","Type":"ContainerDied","Data":"21f4bcb63047ade38e1cad8da0ea5abe3345e973290a68a76d77dd91ccb5d94b"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.733875 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f4bcb63047ade38e1cad8da0ea5abe3345e973290a68a76d77dd91ccb5d94b" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.735276 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k47ps" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.735275 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k47ps" event={"ID":"b2cf19dc-4223-41c2-a849-d02a34917dad","Type":"ContainerDied","Data":"ffcdd972f207c8fc5bcd240fd45d51615b7b2358f64273759e0fae64b503a809"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.735393 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffcdd972f207c8fc5bcd240fd45d51615b7b2358f64273759e0fae64b503a809" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.736935 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" event={"ID":"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b","Type":"ContainerDied","Data":"33a19b576b8f5c87b5a157b6ddad3df1b7f438faa748d2b40fe04fbd6a273726"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.736969 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a19b576b8f5c87b5a157b6ddad3df1b7f438faa748d2b40fe04fbd6a273726" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.736987 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f33-account-create-update-nhlhp" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.739620 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d612-account-create-update-6whq9" event={"ID":"c06dae48-dad2-4fc4-8083-5a83b1cb6eb7","Type":"ContainerDied","Data":"d2b45cc7a0d953dfef336871c5b0298eb56738ae4f6218f0d157e387e703f39f"} Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.739667 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d612-account-create-update-6whq9" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.739678 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2b45cc7a0d953dfef336871c5b0298eb56738ae4f6218f0d157e387e703f39f" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.746913 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-operator-scripts\") pod \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.746978 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgfk5\" (UniqueName: \"kubernetes.io/projected/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-kube-api-access-fgfk5\") pod \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\" (UID: \"bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b\") " Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747380 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" (UID: "bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747659 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2cf19dc-4223-41c2-a849-d02a34917dad-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747672 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747681 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03c89387-da4a-4f04-ae29-8acf81d8c18f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747689 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747697 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2xm\" (UniqueName: \"kubernetes.io/projected/1a9602ab-6ad6-4b2a-b142-5b646433ed19-kube-api-access-dh2xm\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747707 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhz8h\" (UniqueName: \"kubernetes.io/projected/b2cf19dc-4223-41c2-a849-d02a34917dad-kube-api-access-fhz8h\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747715 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7btk\" (UniqueName: \"kubernetes.io/projected/03c89387-da4a-4f04-ae29-8acf81d8c18f-kube-api-access-x7btk\") on node \"crc\" DevicePath \"\"" Jan 21 
09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747723 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2s4b\" (UniqueName: \"kubernetes.io/projected/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7-kube-api-access-t2s4b\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.747732 4618 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a9602ab-6ad6-4b2a-b142-5b646433ed19-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.753266 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-kube-api-access-fgfk5" (OuterVolumeSpecName: "kube-api-access-fgfk5") pod "bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" (UID: "bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b"). InnerVolumeSpecName "kube-api-access-fgfk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.849924 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgfk5\" (UniqueName: \"kubernetes.io/projected/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b-kube-api-access-fgfk5\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.861364 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.861528 4618 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 09:19:39 crc kubenswrapper[4618]: I0121 09:19:39.926996 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.945932 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-th4n8"] Jan 21 09:19:40 crc 
kubenswrapper[4618]: E0121 09:19:40.946621 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9602ab-6ad6-4b2a-b142-5b646433ed19" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946636 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9602ab-6ad6-4b2a-b142-5b646433ed19" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: E0121 09:19:40.946653 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cf19dc-4223-41c2-a849-d02a34917dad" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946658 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cf19dc-4223-41c2-a849-d02a34917dad" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: E0121 09:19:40.946671 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946677 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: E0121 09:19:40.946696 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946701 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: E0121 09:19:40.946713 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c89387-da4a-4f04-ae29-8acf81d8c18f" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946719 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="03c89387-da4a-4f04-ae29-8acf81d8c18f" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: E0121 09:19:40.946731 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946736 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946903 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9602ab-6ad6-4b2a-b142-5b646433ed19" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946913 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2cf19dc-4223-41c2-a849-d02a34917dad" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946924 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946933 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c89387-da4a-4f04-ae29-8acf81d8c18f" containerName="mariadb-database-create" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946948 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.946963 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" containerName="mariadb-account-create-update" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.947605 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.958088 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.958609 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.958735 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5kswc" Jan 21 09:19:40 crc kubenswrapper[4618]: I0121 09:19:40.963989 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-th4n8"] Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.075881 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svv92\" (UniqueName: \"kubernetes.io/projected/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-kube-api-access-svv92\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.075945 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-config-data\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.076047 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " 
pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.076081 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-scripts\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.178191 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.178589 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-scripts\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.178811 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svv92\" (UniqueName: \"kubernetes.io/projected/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-kube-api-access-svv92\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.178860 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-config-data\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " 
pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.187165 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-scripts\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.191742 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.192813 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-config-data\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.198569 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svv92\" (UniqueName: \"kubernetes.io/projected/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-kube-api-access-svv92\") pod \"nova-cell0-conductor-db-sync-th4n8\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.270464 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.604125 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.702077 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-th4n8"] Jan 21 09:19:41 crc kubenswrapper[4618]: W0121 09:19:41.716417 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d3d0298_3dc5_4e2a_8f7a_6bf7a11bb1fc.slice/crio-9e6f8ff705ffe2174c87cb29de646bdb710da0e2d4516e584e0aeae0d74ead8c WatchSource:0}: Error finding container 9e6f8ff705ffe2174c87cb29de646bdb710da0e2d4516e584e0aeae0d74ead8c: Status 404 returned error can't find the container with id 9e6f8ff705ffe2174c87cb29de646bdb710da0e2d4516e584e0aeae0d74ead8c Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.771115 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerStarted","Data":"f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb"} Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.771384 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.773844 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-th4n8" event={"ID":"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc","Type":"ContainerStarted","Data":"9e6f8ff705ffe2174c87cb29de646bdb710da0e2d4516e584e0aeae0d74ead8c"} Jan 21 09:19:41 crc kubenswrapper[4618]: I0121 09:19:41.794240 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.71367604 podStartE2EDuration="6.794220151s" podCreationTimestamp="2026-01-21 09:19:35 +0000 UTC" 
firstStartedPulling="2026-01-21 09:19:36.670959047 +0000 UTC m=+975.421426364" lastFinishedPulling="2026-01-21 09:19:40.751503158 +0000 UTC m=+979.501970475" observedRunningTime="2026-01-21 09:19:41.79096529 +0000 UTC m=+980.541432606" watchObservedRunningTime="2026-01-21 09:19:41.794220151 +0000 UTC m=+980.544687469" Jan 21 09:19:42 crc kubenswrapper[4618]: I0121 09:19:42.783970 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-central-agent" containerID="cri-o://4523d5223dc263eb1295a3ff5c472317f398a6a5f2b599ae79353c61ef0d36c1" gracePeriod=30 Jan 21 09:19:42 crc kubenswrapper[4618]: I0121 09:19:42.784984 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="proxy-httpd" containerID="cri-o://f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb" gracePeriod=30 Jan 21 09:19:42 crc kubenswrapper[4618]: I0121 09:19:42.785211 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="sg-core" containerID="cri-o://8a5dca9b3b50f6a120d7ecac33673d88b85b7761a56952aef9f11a198407e952" gracePeriod=30 Jan 21 09:19:42 crc kubenswrapper[4618]: I0121 09:19:42.785331 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-notification-agent" containerID="cri-o://4002cb1196fdf40f6ec255360ac87f1822cf4cb14e06d3e3ac592254977b9a3a" gracePeriod=30 Jan 21 09:19:43 crc kubenswrapper[4618]: E0121 09:19:43.217106 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290b1aba_2b06_40f3_87a2_3cb601cf9b63.slice/crio-conmon-f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290b1aba_2b06_40f3_87a2_3cb601cf9b63.slice/crio-f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb.scope\": RecentStats: unable to find data in memory cache]" Jan 21 09:19:43 crc kubenswrapper[4618]: I0121 09:19:43.796779 4618 generic.go:334] "Generic (PLEG): container finished" podID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerID="f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb" exitCode=0 Jan 21 09:19:43 crc kubenswrapper[4618]: I0121 09:19:43.796816 4618 generic.go:334] "Generic (PLEG): container finished" podID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerID="8a5dca9b3b50f6a120d7ecac33673d88b85b7761a56952aef9f11a198407e952" exitCode=2 Jan 21 09:19:43 crc kubenswrapper[4618]: I0121 09:19:43.796825 4618 generic.go:334] "Generic (PLEG): container finished" podID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerID="4002cb1196fdf40f6ec255360ac87f1822cf4cb14e06d3e3ac592254977b9a3a" exitCode=0 Jan 21 09:19:43 crc kubenswrapper[4618]: I0121 09:19:43.796847 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerDied","Data":"f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb"} Jan 21 09:19:43 crc kubenswrapper[4618]: I0121 09:19:43.796875 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerDied","Data":"8a5dca9b3b50f6a120d7ecac33673d88b85b7761a56952aef9f11a198407e952"} Jan 21 09:19:43 crc kubenswrapper[4618]: I0121 09:19:43.796885 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerDied","Data":"4002cb1196fdf40f6ec255360ac87f1822cf4cb14e06d3e3ac592254977b9a3a"} Jan 21 09:19:46 crc kubenswrapper[4618]: I0121 09:19:46.845129 4618 generic.go:334] "Generic (PLEG): container finished" podID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerID="4523d5223dc263eb1295a3ff5c472317f398a6a5f2b599ae79353c61ef0d36c1" exitCode=0 Jan 21 09:19:46 crc kubenswrapper[4618]: I0121 09:19:46.845185 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerDied","Data":"4523d5223dc263eb1295a3ff5c472317f398a6a5f2b599ae79353c61ef0d36c1"} Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.372511 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.479536 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-ceilometer-tls-certs\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.479806 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-config-data\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.479839 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-run-httpd\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.479876 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrg6n\" (UniqueName: \"kubernetes.io/projected/290b1aba-2b06-40f3-87a2-3cb601cf9b63-kube-api-access-zrg6n\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.480062 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-sg-core-conf-yaml\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.480194 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-scripts\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.480302 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-log-httpd\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.480384 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-combined-ca-bundle\") pod \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\" (UID: \"290b1aba-2b06-40f3-87a2-3cb601cf9b63\") " Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.480287 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: 
"290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.481216 4618 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.481526 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.484129 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290b1aba-2b06-40f3-87a2-3cb601cf9b63-kube-api-access-zrg6n" (OuterVolumeSpecName: "kube-api-access-zrg6n") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "kube-api-access-zrg6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.484517 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-scripts" (OuterVolumeSpecName: "scripts") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.507299 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.518218 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.554320 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-config-data" (OuterVolumeSpecName: "config-data") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.557650 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "290b1aba-2b06-40f3-87a2-3cb601cf9b63" (UID: "290b1aba-2b06-40f3-87a2-3cb601cf9b63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583689 4618 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583725 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583736 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrg6n\" (UniqueName: \"kubernetes.io/projected/290b1aba-2b06-40f3-87a2-3cb601cf9b63-kube-api-access-zrg6n\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583748 4618 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583757 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583769 4618 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/290b1aba-2b06-40f3-87a2-3cb601cf9b63-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.583778 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290b1aba-2b06-40f3-87a2-3cb601cf9b63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.876893 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"290b1aba-2b06-40f3-87a2-3cb601cf9b63","Type":"ContainerDied","Data":"7187bc8a634b81ae13955ce34d72230a06d7ebb61d6233089d8a203d653cc53b"} Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.876952 4618 scope.go:117] "RemoveContainer" containerID="f80e5670182a0036d557e5146323573b4480add57b80e85e8368fbc3c4e0cdcb" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.876956 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.878521 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-th4n8" event={"ID":"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc","Type":"ContainerStarted","Data":"d1cdfd5ad8454b9f8817f40bb774fc0ea3c8449535026c0c00aaf0aa372de33a"} Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.900168 4618 scope.go:117] "RemoveContainer" containerID="8a5dca9b3b50f6a120d7ecac33673d88b85b7761a56952aef9f11a198407e952" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.910811 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-th4n8" podStartSLOduration=2.347919245 podStartE2EDuration="9.910796758s" podCreationTimestamp="2026-01-21 09:19:40 +0000 UTC" firstStartedPulling="2026-01-21 09:19:41.719012972 +0000 UTC m=+980.469480290" lastFinishedPulling="2026-01-21 09:19:49.281890486 +0000 UTC m=+988.032357803" observedRunningTime="2026-01-21 09:19:49.907669104 +0000 UTC m=+988.658136412" watchObservedRunningTime="2026-01-21 09:19:49.910796758 +0000 UTC m=+988.661264074" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.918262 4618 scope.go:117] "RemoveContainer" containerID="4002cb1196fdf40f6ec255360ac87f1822cf4cb14e06d3e3ac592254977b9a3a" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.934092 4618 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.939321 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.957786 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:49 crc kubenswrapper[4618]: E0121 09:19:49.962303 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-notification-agent" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962329 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-notification-agent" Jan 21 09:19:49 crc kubenswrapper[4618]: E0121 09:19:49.962355 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="sg-core" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962362 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="sg-core" Jan 21 09:19:49 crc kubenswrapper[4618]: E0121 09:19:49.962374 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="proxy-httpd" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962381 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="proxy-httpd" Jan 21 09:19:49 crc kubenswrapper[4618]: E0121 09:19:49.962403 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-central-agent" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962409 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-central-agent" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 
09:19:49.962630 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-central-agent" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962671 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="ceilometer-notification-agent" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962686 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="proxy-httpd" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.962695 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" containerName="sg-core" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.964559 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.970119 4618 scope.go:117] "RemoveContainer" containerID="4523d5223dc263eb1295a3ff5c472317f398a6a5f2b599ae79353c61ef0d36c1" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.970656 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.971522 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.971651 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 09:19:49 crc kubenswrapper[4618]: I0121 09:19:49.972468 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.094992 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6ht4\" (UniqueName: 
\"kubernetes.io/projected/963186b6-ebce-48d8-8e25-36c695cee351-kube-api-access-f6ht4\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095182 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095222 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095265 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-scripts\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095392 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095450 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-config-data\") pod \"ceilometer-0\" (UID: 
\"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095528 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-run-httpd\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.095600 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-log-httpd\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197077 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197119 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197157 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-scripts\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197193 4618 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197212 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-config-data\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197242 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-run-httpd\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197290 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-log-httpd\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.197320 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6ht4\" (UniqueName: \"kubernetes.io/projected/963186b6-ebce-48d8-8e25-36c695cee351-kube-api-access-f6ht4\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.198514 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-run-httpd\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 
09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.198746 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-log-httpd\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.201242 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-config-data\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.201246 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.201404 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.201903 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.202504 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-scripts\") pod \"ceilometer-0\" (UID: 
\"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.214287 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6ht4\" (UniqueName: \"kubernetes.io/projected/963186b6-ebce-48d8-8e25-36c695cee351-kube-api-access-f6ht4\") pod \"ceilometer-0\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.279672 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.676857 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:19:50 crc kubenswrapper[4618]: W0121 09:19:50.687895 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod963186b6_ebce_48d8_8e25_36c695cee351.slice/crio-e31bf467f8d93c05449feda295472b8b1d94217375f1ba84229ca0419cdc26de WatchSource:0}: Error finding container e31bf467f8d93c05449feda295472b8b1d94217375f1ba84229ca0419cdc26de: Status 404 returned error can't find the container with id e31bf467f8d93c05449feda295472b8b1d94217375f1ba84229ca0419cdc26de Jan 21 09:19:50 crc kubenswrapper[4618]: I0121 09:19:50.892876 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerStarted","Data":"e31bf467f8d93c05449feda295472b8b1d94217375f1ba84229ca0419cdc26de"} Jan 21 09:19:51 crc kubenswrapper[4618]: I0121 09:19:51.547208 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290b1aba-2b06-40f3-87a2-3cb601cf9b63" path="/var/lib/kubelet/pods/290b1aba-2b06-40f3-87a2-3cb601cf9b63/volumes" Jan 21 09:19:51 crc kubenswrapper[4618]: I0121 09:19:51.900588 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerStarted","Data":"fd4775994f261f1543cc3ef4d4ab1c6349da0abbee527e7d707780fb541ce53b"} Jan 21 09:19:52 crc kubenswrapper[4618]: I0121 09:19:52.909516 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerStarted","Data":"8556a0e1a4f75152da5542599b05f270d7e7a8c6414427c95c8543069e7d0b8f"} Jan 21 09:19:52 crc kubenswrapper[4618]: I0121 09:19:52.909573 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerStarted","Data":"13b8fc9f19f652ea23485a00955662e207dd7cd5307bf487db73ef156e003851"} Jan 21 09:19:54 crc kubenswrapper[4618]: I0121 09:19:54.935552 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerStarted","Data":"c9ad8aacd8252338c28318b975f968ea17f799660546b6e8c06e9d3782f664b5"} Jan 21 09:19:54 crc kubenswrapper[4618]: I0121 09:19:54.936294 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 09:19:54 crc kubenswrapper[4618]: I0121 09:19:54.965053 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.876628425 podStartE2EDuration="5.965036356s" podCreationTimestamp="2026-01-21 09:19:49 +0000 UTC" firstStartedPulling="2026-01-21 09:19:50.690387995 +0000 UTC m=+989.440855312" lastFinishedPulling="2026-01-21 09:19:53.778795926 +0000 UTC m=+992.529263243" observedRunningTime="2026-01-21 09:19:54.955465785 +0000 UTC m=+993.705933103" watchObservedRunningTime="2026-01-21 09:19:54.965036356 +0000 UTC m=+993.715503673" Jan 21 09:19:55 crc kubenswrapper[4618]: I0121 09:19:55.955122 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" 
containerID="d1cdfd5ad8454b9f8817f40bb774fc0ea3c8449535026c0c00aaf0aa372de33a" exitCode=0 Jan 21 09:19:55 crc kubenswrapper[4618]: I0121 09:19:55.957105 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-th4n8" event={"ID":"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc","Type":"ContainerDied","Data":"d1cdfd5ad8454b9f8817f40bb774fc0ea3c8449535026c0c00aaf0aa372de33a"} Jan 21 09:19:56 crc kubenswrapper[4618]: I0121 09:19:56.958893 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:19:56 crc kubenswrapper[4618]: I0121 09:19:56.959286 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.257285 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.334294 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-config-data\") pod \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.334405 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svv92\" (UniqueName: \"kubernetes.io/projected/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-kube-api-access-svv92\") pod \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.334496 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-combined-ca-bundle\") pod \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.340757 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-kube-api-access-svv92" (OuterVolumeSpecName: "kube-api-access-svv92") pod "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" (UID: "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc"). InnerVolumeSpecName "kube-api-access-svv92". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.358691 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" (UID: "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.359726 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-config-data" (OuterVolumeSpecName: "config-data") pod "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" (UID: "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.436225 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-scripts\") pod \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\" (UID: \"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc\") " Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.436991 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.437015 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svv92\" (UniqueName: \"kubernetes.io/projected/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-kube-api-access-svv92\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.437027 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.438983 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-scripts" (OuterVolumeSpecName: "scripts") pod "1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" (UID: 
"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.539125 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.978496 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-th4n8" event={"ID":"1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc","Type":"ContainerDied","Data":"9e6f8ff705ffe2174c87cb29de646bdb710da0e2d4516e584e0aeae0d74ead8c"} Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.978546 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e6f8ff705ffe2174c87cb29de646bdb710da0e2d4516e584e0aeae0d74ead8c" Jan 21 09:19:57 crc kubenswrapper[4618]: I0121 09:19:57.978629 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-th4n8" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.065934 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 21 09:19:58 crc kubenswrapper[4618]: E0121 09:19:58.068192 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" containerName="nova-cell0-conductor-db-sync" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.068222 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" containerName="nova-cell0-conductor-db-sync" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.068487 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" containerName="nova-cell0-conductor-db-sync" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.069684 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.072731 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5kswc" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.073239 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.077658 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.149925 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56kt\" (UniqueName: \"kubernetes.io/projected/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-kube-api-access-s56kt\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc 
kubenswrapper[4618]: I0121 09:19:58.150563 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.150871 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.253647 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.253738 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56kt\" (UniqueName: \"kubernetes.io/projected/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-kube-api-access-s56kt\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.253798 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.260455 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.260540 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.270461 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56kt\" (UniqueName: \"kubernetes.io/projected/e717fabb-c7f8-4c12-a063-e9a5b0d2a671-kube-api-access-s56kt\") pod \"nova-cell0-conductor-0\" (UID: \"e717fabb-c7f8-4c12-a063-e9a5b0d2a671\") " pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.392931 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.786757 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 21 09:19:58 crc kubenswrapper[4618]: W0121 09:19:58.791763 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode717fabb_c7f8_4c12_a063_e9a5b0d2a671.slice/crio-cf955b5e8ddf0b81ff435dfe66261bcb8484591b5deef6721395cd03c1a8235b WatchSource:0}: Error finding container cf955b5e8ddf0b81ff435dfe66261bcb8484591b5deef6721395cd03c1a8235b: Status 404 returned error can't find the container with id cf955b5e8ddf0b81ff435dfe66261bcb8484591b5deef6721395cd03c1a8235b Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.989258 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e717fabb-c7f8-4c12-a063-e9a5b0d2a671","Type":"ContainerStarted","Data":"419c8db74acda1d6b5a6a374d6b40928a158451d9c00ddf5c36d94ec924c818f"} Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.989316 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e717fabb-c7f8-4c12-a063-e9a5b0d2a671","Type":"ContainerStarted","Data":"cf955b5e8ddf0b81ff435dfe66261bcb8484591b5deef6721395cd03c1a8235b"} Jan 21 09:19:58 crc kubenswrapper[4618]: I0121 09:19:58.989408 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 21 09:19:59 crc kubenswrapper[4618]: I0121 09:19:59.005019 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.005000524 podStartE2EDuration="1.005000524s" podCreationTimestamp="2026-01-21 09:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 
09:19:59.002800457 +0000 UTC m=+997.753267774" watchObservedRunningTime="2026-01-21 09:19:59.005000524 +0000 UTC m=+997.755467841" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.418727 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.911898 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gl84v"] Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.914444 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.917595 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.917619 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.924314 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gl84v"] Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.976758 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-config-data\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.977240 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/f3fa4156-22b0-45b2-b300-e7a4b768964a-kube-api-access-hlfvv\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 
09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.977298 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:03 crc kubenswrapper[4618]: I0121 09:20:03.977456 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-scripts\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.018498 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.019831 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.022030 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.035586 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081331 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-config-data\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081435 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081468 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-config-data\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081496 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/f3fa4156-22b0-45b2-b300-e7a4b768964a-kube-api-access-hlfvv\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081513 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l58h\" (UniqueName: \"kubernetes.io/projected/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-kube-api-access-5l58h\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081529 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.081571 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-scripts\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.087225 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-config-data\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.087613 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.098751 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-scripts\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.109082 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/f3fa4156-22b0-45b2-b300-e7a4b768964a-kube-api-access-hlfvv\") pod \"nova-cell0-cell-mapping-gl84v\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.135417 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.138335 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.141050 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.170243 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.172493 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.175464 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.189329 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.189651 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-config-data\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.189789 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-config-data\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.189875 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l58h\" (UniqueName: \"kubernetes.io/projected/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-kube-api-access-5l58h\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.189966 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140504cb-90bd-4e75-8718-11a596f15f72-logs\") pod \"nova-api-0\" (UID: 
\"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.190087 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.190199 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-config-data\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.192075 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntlb4\" (UniqueName: \"kubernetes.io/projected/9f3e2b88-be74-4b59-a2d5-c12132a661d3-kube-api-access-ntlb4\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.192329 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxch\" (UniqueName: \"kubernetes.io/projected/140504cb-90bd-4e75-8718-11a596f15f72-kube-api-access-8gxch\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.192359 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3e2b88-be74-4b59-a2d5-c12132a661d3-logs\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: 
I0121 09:20:04.192378 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.195216 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.196341 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-config-data\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.197522 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.211084 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.232067 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l58h\" (UniqueName: \"kubernetes.io/projected/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-kube-api-access-5l58h\") pod \"nova-scheduler-0\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.251175 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.267564 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.268905 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.277502 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.284446 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.293955 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-wjw8p"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.295440 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.300872 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxch\" (UniqueName: \"kubernetes.io/projected/140504cb-90bd-4e75-8718-11a596f15f72-kube-api-access-8gxch\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.300920 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3e2b88-be74-4b59-a2d5-c12132a661d3-logs\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.300947 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301062 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301084 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-config-data\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301118 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140504cb-90bd-4e75-8718-11a596f15f72-logs\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301215 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301243 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-config-data\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301282 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52kzz\" (UniqueName: \"kubernetes.io/projected/9bc65221-2632-4f13-94fb-54dd961ee4d3-kube-api-access-52kzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301374 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.301500 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntlb4\" (UniqueName: 
\"kubernetes.io/projected/9f3e2b88-be74-4b59-a2d5-c12132a661d3-kube-api-access-ntlb4\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.302412 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3e2b88-be74-4b59-a2d5-c12132a661d3-logs\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.306554 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140504cb-90bd-4e75-8718-11a596f15f72-logs\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.308696 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-config-data\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.310761 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-config-data\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.312172 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.308558 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.326402 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-wjw8p"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.331647 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntlb4\" (UniqueName: \"kubernetes.io/projected/9f3e2b88-be74-4b59-a2d5-c12132a661d3-kube-api-access-ntlb4\") pod \"nova-metadata-0\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.332204 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxch\" (UniqueName: \"kubernetes.io/projected/140504cb-90bd-4e75-8718-11a596f15f72-kube-api-access-8gxch\") pod \"nova-api-0\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.342421 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404339 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-config\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404479 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404577 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404624 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404658 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pg9m\" (UniqueName: \"kubernetes.io/projected/cfcbb487-fc93-4137-99fa-32e268de295a-kube-api-access-8pg9m\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: 
\"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404723 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52kzz\" (UniqueName: \"kubernetes.io/projected/9bc65221-2632-4f13-94fb-54dd961ee4d3-kube-api-access-52kzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404756 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404784 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.404829 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.408277 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.408827 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.424189 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52kzz\" (UniqueName: \"kubernetes.io/projected/9bc65221-2632-4f13-94fb-54dd961ee4d3-kube-api-access-52kzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.488500 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.497745 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.506353 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.506394 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pg9m\" (UniqueName: \"kubernetes.io/projected/cfcbb487-fc93-4137-99fa-32e268de295a-kube-api-access-8pg9m\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.506450 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.506488 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.506549 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-config\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 
09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.506609 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.507359 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.507864 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.508610 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.509092 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.509623 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-config\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.525693 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pg9m\" (UniqueName: \"kubernetes.io/projected/cfcbb487-fc93-4137-99fa-32e268de295a-kube-api-access-8pg9m\") pod \"dnsmasq-dns-647df7b8c5-wjw8p\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.661760 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.672848 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.810034 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gl84v"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.928379 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.939160 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mzrwt"] Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.940431 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.942647 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.942759 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 09:20:04 crc kubenswrapper[4618]: I0121 09:20:04.982559 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mzrwt"] Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.020599 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-config-data\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.020784 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4stfd\" (UniqueName: \"kubernetes.io/projected/180187c8-6cea-49da-86c8-b0709d20403d-kube-api-access-4stfd\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.020815 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.020848 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-scripts\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.040677 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:05 crc kubenswrapper[4618]: W0121 09:20:05.065298 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f3e2b88_be74_4b59_a2d5_c12132a661d3.slice/crio-78fd9c9682c2e2ddb5f993ff1d8946e58fe4fdb6da71ce6ba041de571241604e WatchSource:0}: Error finding container 78fd9c9682c2e2ddb5f993ff1d8946e58fe4fdb6da71ce6ba041de571241604e: Status 404 returned error can't find the container with id 78fd9c9682c2e2ddb5f993ff1d8946e58fe4fdb6da71ce6ba041de571241604e Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.066866 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.076123 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gl84v" event={"ID":"f3fa4156-22b0-45b2-b300-e7a4b768964a","Type":"ContainerStarted","Data":"4b2ac550cd82c2011c2aff516a4b46f15c57aa85c4fd4772bf7a789dfd4ea162"} Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.076180 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gl84v" event={"ID":"f3fa4156-22b0-45b2-b300-e7a4b768964a","Type":"ContainerStarted","Data":"e11b388e5f7eb5919fde33a1c329a9ae5be3ab7f21116aad95887752955534bf"} Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.080369 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"537b32bd-fcec-4a2d-8adf-fbe3d145a95d","Type":"ContainerStarted","Data":"bca99678dc84a03d669d0270a2d733f83add07b0f2c097ca10bc575cb95f5e0d"} Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.099540 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gl84v" podStartSLOduration=2.099523009 podStartE2EDuration="2.099523009s" podCreationTimestamp="2026-01-21 09:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:05.08935742 +0000 UTC m=+1003.839824737" watchObservedRunningTime="2026-01-21 09:20:05.099523009 +0000 UTC m=+1003.849990326" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.122709 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4stfd\" (UniqueName: \"kubernetes.io/projected/180187c8-6cea-49da-86c8-b0709d20403d-kube-api-access-4stfd\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.122762 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.122799 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-scripts\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.122995 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-config-data\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.128409 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-scripts\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.130223 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.133280 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-config-data\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.136718 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4stfd\" (UniqueName: \"kubernetes.io/projected/180187c8-6cea-49da-86c8-b0709d20403d-kube-api-access-4stfd\") pod \"nova-cell1-conductor-db-sync-mzrwt\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.204414 4618 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.214131 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-wjw8p"] Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.269590 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:05 crc kubenswrapper[4618]: I0121 09:20:05.692521 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mzrwt"] Jan 21 09:20:05 crc kubenswrapper[4618]: W0121 09:20:05.696778 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod180187c8_6cea_49da_86c8_b0709d20403d.slice/crio-8c9abae303e33ef24fa07b779b3789c76cf6678e14657913f0330d1220d9a67f WatchSource:0}: Error finding container 8c9abae303e33ef24fa07b779b3789c76cf6678e14657913f0330d1220d9a67f: Status 404 returned error can't find the container with id 8c9abae303e33ef24fa07b779b3789c76cf6678e14657913f0330d1220d9a67f Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.134867 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3e2b88-be74-4b59-a2d5-c12132a661d3","Type":"ContainerStarted","Data":"78fd9c9682c2e2ddb5f993ff1d8946e58fe4fdb6da71ce6ba041de571241604e"} Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.137920 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bc65221-2632-4f13-94fb-54dd961ee4d3","Type":"ContainerStarted","Data":"f38ad41e1b233c9d971b8c3367ab5e88f93e7041ae80ba9eba72064b4d46000c"} Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.139575 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"140504cb-90bd-4e75-8718-11a596f15f72","Type":"ContainerStarted","Data":"18a11fd0a5ee62f90b682f035f459e341c96fc04df5a1917758f12f7d0605a66"} Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.141357 4618 generic.go:334] "Generic (PLEG): container finished" podID="cfcbb487-fc93-4137-99fa-32e268de295a" containerID="7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652" exitCode=0 Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.141405 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" event={"ID":"cfcbb487-fc93-4137-99fa-32e268de295a","Type":"ContainerDied","Data":"7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652"} Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.141422 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" event={"ID":"cfcbb487-fc93-4137-99fa-32e268de295a","Type":"ContainerStarted","Data":"8bae8bbfcc8125ad08ad402f697c2b07fb0e8f753f80ebadaf38dab4da0e0aa9"} Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.148236 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" event={"ID":"180187c8-6cea-49da-86c8-b0709d20403d","Type":"ContainerStarted","Data":"172bfda592ead090abe21a7d722410a375184f3b8ca826356e0689144a64e370"} Jan 21 09:20:06 crc kubenswrapper[4618]: I0121 09:20:06.148547 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" event={"ID":"180187c8-6cea-49da-86c8-b0709d20403d","Type":"ContainerStarted","Data":"8c9abae303e33ef24fa07b779b3789c76cf6678e14657913f0330d1220d9a67f"} Jan 21 09:20:07 crc kubenswrapper[4618]: I0121 09:20:07.167858 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" event={"ID":"cfcbb487-fc93-4137-99fa-32e268de295a","Type":"ContainerStarted","Data":"5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa"} Jan 21 
09:20:07 crc kubenswrapper[4618]: I0121 09:20:07.182927 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" podStartSLOduration=3.182902692 podStartE2EDuration="3.182902692s" podCreationTimestamp="2026-01-21 09:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:06.193358275 +0000 UTC m=+1004.943825592" watchObservedRunningTime="2026-01-21 09:20:07.182902692 +0000 UTC m=+1005.933369999" Jan 21 09:20:07 crc kubenswrapper[4618]: I0121 09:20:07.185615 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:07 crc kubenswrapper[4618]: I0121 09:20:07.195768 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:07 crc kubenswrapper[4618]: I0121 09:20:07.196711 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" podStartSLOduration=3.196697547 podStartE2EDuration="3.196697547s" podCreationTimestamp="2026-01-21 09:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:07.18728853 +0000 UTC m=+1005.937755847" watchObservedRunningTime="2026-01-21 09:20:07.196697547 +0000 UTC m=+1005.947164864" Jan 21 09:20:08 crc kubenswrapper[4618]: I0121 09:20:08.176037 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.192246 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"140504cb-90bd-4e75-8718-11a596f15f72","Type":"ContainerStarted","Data":"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338"} Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.192632 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"140504cb-90bd-4e75-8718-11a596f15f72","Type":"ContainerStarted","Data":"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1"} Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.195378 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3e2b88-be74-4b59-a2d5-c12132a661d3","Type":"ContainerStarted","Data":"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc"} Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.195449 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3e2b88-be74-4b59-a2d5-c12132a661d3","Type":"ContainerStarted","Data":"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998"} Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.195476 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-log" containerID="cri-o://12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998" gracePeriod=30 Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.195525 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-metadata" containerID="cri-o://3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc" gracePeriod=30 Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.198185 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bc65221-2632-4f13-94fb-54dd961ee4d3","Type":"ContainerStarted","Data":"b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9"} Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.198466 4618 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-novncproxy-0" podUID="9bc65221-2632-4f13-94fb-54dd961ee4d3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9" gracePeriod=30 Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.204429 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"537b32bd-fcec-4a2d-8adf-fbe3d145a95d","Type":"ContainerStarted","Data":"3e2d834ff9dc0893d333058d6573f7d9b5b3eddb14271e82d8d8731e775d7fea"} Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.214538 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.31597951 podStartE2EDuration="5.214518176s" podCreationTimestamp="2026-01-21 09:20:04 +0000 UTC" firstStartedPulling="2026-01-21 09:20:05.071957465 +0000 UTC m=+1003.822424782" lastFinishedPulling="2026-01-21 09:20:07.970496131 +0000 UTC m=+1006.720963448" observedRunningTime="2026-01-21 09:20:09.210664879 +0000 UTC m=+1007.961132196" watchObservedRunningTime="2026-01-21 09:20:09.214518176 +0000 UTC m=+1007.964985493" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.235278 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.460390849 podStartE2EDuration="5.235259737s" podCreationTimestamp="2026-01-21 09:20:04 +0000 UTC" firstStartedPulling="2026-01-21 09:20:05.206901185 +0000 UTC m=+1003.957368503" lastFinishedPulling="2026-01-21 09:20:07.981770085 +0000 UTC m=+1006.732237391" observedRunningTime="2026-01-21 09:20:09.23040167 +0000 UTC m=+1007.980868988" watchObservedRunningTime="2026-01-21 09:20:09.235259737 +0000 UTC m=+1007.985727044" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.246988 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.211502631 
podStartE2EDuration="6.246973398s" podCreationTimestamp="2026-01-21 09:20:03 +0000 UTC" firstStartedPulling="2026-01-21 09:20:04.939257263 +0000 UTC m=+1003.689724580" lastFinishedPulling="2026-01-21 09:20:07.97472803 +0000 UTC m=+1006.725195347" observedRunningTime="2026-01-21 09:20:09.242424082 +0000 UTC m=+1007.992891399" watchObservedRunningTime="2026-01-21 09:20:09.246973398 +0000 UTC m=+1007.997440715" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.261601 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.365477033 podStartE2EDuration="5.261574118s" podCreationTimestamp="2026-01-21 09:20:04 +0000 UTC" firstStartedPulling="2026-01-21 09:20:05.084501488 +0000 UTC m=+1003.834968804" lastFinishedPulling="2026-01-21 09:20:07.980598572 +0000 UTC m=+1006.731065889" observedRunningTime="2026-01-21 09:20:09.255419222 +0000 UTC m=+1008.005886539" watchObservedRunningTime="2026-01-21 09:20:09.261574118 +0000 UTC m=+1008.012041426" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.344749 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.490284 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.490348 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.662153 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.693589 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.742864 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3e2b88-be74-4b59-a2d5-c12132a661d3-logs\") pod \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.743086 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-combined-ca-bundle\") pod \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.743387 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-config-data\") pod \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.743522 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntlb4\" (UniqueName: \"kubernetes.io/projected/9f3e2b88-be74-4b59-a2d5-c12132a661d3-kube-api-access-ntlb4\") pod \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\" (UID: \"9f3e2b88-be74-4b59-a2d5-c12132a661d3\") " Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.745256 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3e2b88-be74-4b59-a2d5-c12132a661d3-logs" (OuterVolumeSpecName: "logs") pod "9f3e2b88-be74-4b59-a2d5-c12132a661d3" (UID: "9f3e2b88-be74-4b59-a2d5-c12132a661d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.749290 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3e2b88-be74-4b59-a2d5-c12132a661d3-kube-api-access-ntlb4" (OuterVolumeSpecName: "kube-api-access-ntlb4") pod "9f3e2b88-be74-4b59-a2d5-c12132a661d3" (UID: "9f3e2b88-be74-4b59-a2d5-c12132a661d3"). InnerVolumeSpecName "kube-api-access-ntlb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.764612 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f3e2b88-be74-4b59-a2d5-c12132a661d3" (UID: "9f3e2b88-be74-4b59-a2d5-c12132a661d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.767975 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-config-data" (OuterVolumeSpecName: "config-data") pod "9f3e2b88-be74-4b59-a2d5-c12132a661d3" (UID: "9f3e2b88-be74-4b59-a2d5-c12132a661d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.845773 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.845808 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntlb4\" (UniqueName: \"kubernetes.io/projected/9f3e2b88-be74-4b59-a2d5-c12132a661d3-kube-api-access-ntlb4\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.845823 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3e2b88-be74-4b59-a2d5-c12132a661d3-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:09 crc kubenswrapper[4618]: I0121 09:20:09.845835 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3e2b88-be74-4b59-a2d5-c12132a661d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213668 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerID="3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc" exitCode=0 Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213702 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerID="12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998" exitCode=143 Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213750 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3e2b88-be74-4b59-a2d5-c12132a661d3","Type":"ContainerDied","Data":"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc"} Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213782 4618 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3e2b88-be74-4b59-a2d5-c12132a661d3","Type":"ContainerDied","Data":"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998"} Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213793 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3e2b88-be74-4b59-a2d5-c12132a661d3","Type":"ContainerDied","Data":"78fd9c9682c2e2ddb5f993ff1d8946e58fe4fdb6da71ce6ba041de571241604e"} Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213809 4618 scope.go:117] "RemoveContainer" containerID="3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.213918 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.220628 4618 generic.go:334] "Generic (PLEG): container finished" podID="180187c8-6cea-49da-86c8-b0709d20403d" containerID="172bfda592ead090abe21a7d722410a375184f3b8ca826356e0689144a64e370" exitCode=0 Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.221786 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" event={"ID":"180187c8-6cea-49da-86c8-b0709d20403d","Type":"ContainerDied","Data":"172bfda592ead090abe21a7d722410a375184f3b8ca826356e0689144a64e370"} Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.238265 4618 scope.go:117] "RemoveContainer" containerID="12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.267731 4618 scope.go:117] "RemoveContainer" containerID="3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc" Jan 21 09:20:10 crc kubenswrapper[4618]: E0121 09:20:10.269766 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc\": container with ID starting with 3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc not found: ID does not exist" containerID="3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.269808 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc"} err="failed to get container status \"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc\": rpc error: code = NotFound desc = could not find container \"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc\": container with ID starting with 3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc not found: ID does not exist" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.269832 4618 scope.go:117] "RemoveContainer" containerID="12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998" Jan 21 09:20:10 crc kubenswrapper[4618]: E0121 09:20:10.270271 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998\": container with ID starting with 12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998 not found: ID does not exist" containerID="12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.270323 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998"} err="failed to get container status \"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998\": rpc error: code = NotFound desc = could not find container \"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998\": container 
with ID starting with 12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998 not found: ID does not exist" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.270359 4618 scope.go:117] "RemoveContainer" containerID="3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.270760 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc"} err="failed to get container status \"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc\": rpc error: code = NotFound desc = could not find container \"3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc\": container with ID starting with 3b50f305984510b2c58f370a80c204d5436e3c2cd23f8155bd83c7ccb7e3ddfc not found: ID does not exist" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.270788 4618 scope.go:117] "RemoveContainer" containerID="12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.271102 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998"} err="failed to get container status \"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998\": rpc error: code = NotFound desc = could not find container \"12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998\": container with ID starting with 12ce4eb014ea4cf5b7f0b8f49606798fe52dcef0baafc1e33ab9d9ba91e09998 not found: ID does not exist" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.280546 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.291210 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:10 crc 
kubenswrapper[4618]: I0121 09:20:10.296252 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:10 crc kubenswrapper[4618]: E0121 09:20:10.296750 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-log" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.296771 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-log" Jan 21 09:20:10 crc kubenswrapper[4618]: E0121 09:20:10.296825 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-metadata" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.296832 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-metadata" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.296989 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-metadata" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.297019 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" containerName="nova-metadata-log" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.297988 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.302303 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.302371 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.305441 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.354419 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.354565 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.354647 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wq7\" (UniqueName: \"kubernetes.io/projected/e86e9c9c-e52d-4d68-b713-b9793049ceee-kube-api-access-j7wq7\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.354759 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e86e9c9c-e52d-4d68-b713-b9793049ceee-logs\") pod 
\"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.354896 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-config-data\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.456458 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-config-data\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.456735 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.456834 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.456906 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wq7\" (UniqueName: \"kubernetes.io/projected/e86e9c9c-e52d-4d68-b713-b9793049ceee-kube-api-access-j7wq7\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 
09:20:10.457001 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e86e9c9c-e52d-4d68-b713-b9793049ceee-logs\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.457512 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e86e9c9c-e52d-4d68-b713-b9793049ceee-logs\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.463560 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-config-data\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.466595 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.473116 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.473317 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wq7\" (UniqueName: \"kubernetes.io/projected/e86e9c9c-e52d-4d68-b713-b9793049ceee-kube-api-access-j7wq7\") pod \"nova-metadata-0\" (UID: 
\"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " pod="openstack/nova-metadata-0" Jan 21 09:20:10 crc kubenswrapper[4618]: I0121 09:20:10.616346 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.034353 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:11 crc kubenswrapper[4618]: W0121 09:20:11.048620 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode86e9c9c_e52d_4d68_b713_b9793049ceee.slice/crio-98c2d9cf5fe00ebcc59c9cecf1527aebc4d6a512adba5231c17e1a779a87024a WatchSource:0}: Error finding container 98c2d9cf5fe00ebcc59c9cecf1527aebc4d6a512adba5231c17e1a779a87024a: Status 404 returned error can't find the container with id 98c2d9cf5fe00ebcc59c9cecf1527aebc4d6a512adba5231c17e1a779a87024a Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.241768 4618 generic.go:334] "Generic (PLEG): container finished" podID="f3fa4156-22b0-45b2-b300-e7a4b768964a" containerID="4b2ac550cd82c2011c2aff516a4b46f15c57aa85c4fd4772bf7a789dfd4ea162" exitCode=0 Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.242021 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gl84v" event={"ID":"f3fa4156-22b0-45b2-b300-e7a4b768964a","Type":"ContainerDied","Data":"4b2ac550cd82c2011c2aff516a4b46f15c57aa85c4fd4772bf7a789dfd4ea162"} Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.258053 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e86e9c9c-e52d-4d68-b713-b9793049ceee","Type":"ContainerStarted","Data":"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35"} Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.258101 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e86e9c9c-e52d-4d68-b713-b9793049ceee","Type":"ContainerStarted","Data":"98c2d9cf5fe00ebcc59c9cecf1527aebc4d6a512adba5231c17e1a779a87024a"} Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.515616 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.568604 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3e2b88-be74-4b59-a2d5-c12132a661d3" path="/var/lib/kubelet/pods/9f3e2b88-be74-4b59-a2d5-c12132a661d3/volumes" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.581922 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-scripts\") pod \"180187c8-6cea-49da-86c8-b0709d20403d\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.582566 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4stfd\" (UniqueName: \"kubernetes.io/projected/180187c8-6cea-49da-86c8-b0709d20403d-kube-api-access-4stfd\") pod \"180187c8-6cea-49da-86c8-b0709d20403d\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.582733 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-combined-ca-bundle\") pod \"180187c8-6cea-49da-86c8-b0709d20403d\" (UID: \"180187c8-6cea-49da-86c8-b0709d20403d\") " Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.582783 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-config-data\") pod \"180187c8-6cea-49da-86c8-b0709d20403d\" (UID: 
\"180187c8-6cea-49da-86c8-b0709d20403d\") " Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.591707 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-scripts" (OuterVolumeSpecName: "scripts") pod "180187c8-6cea-49da-86c8-b0709d20403d" (UID: "180187c8-6cea-49da-86c8-b0709d20403d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.591750 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180187c8-6cea-49da-86c8-b0709d20403d-kube-api-access-4stfd" (OuterVolumeSpecName: "kube-api-access-4stfd") pod "180187c8-6cea-49da-86c8-b0709d20403d" (UID: "180187c8-6cea-49da-86c8-b0709d20403d"). InnerVolumeSpecName "kube-api-access-4stfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.605320 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "180187c8-6cea-49da-86c8-b0709d20403d" (UID: "180187c8-6cea-49da-86c8-b0709d20403d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.607940 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-config-data" (OuterVolumeSpecName: "config-data") pod "180187c8-6cea-49da-86c8-b0709d20403d" (UID: "180187c8-6cea-49da-86c8-b0709d20403d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.685834 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.685877 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4stfd\" (UniqueName: \"kubernetes.io/projected/180187c8-6cea-49da-86c8-b0709d20403d-kube-api-access-4stfd\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.685894 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:11 crc kubenswrapper[4618]: I0121 09:20:11.685906 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180187c8-6cea-49da-86c8-b0709d20403d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.270740 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" event={"ID":"180187c8-6cea-49da-86c8-b0709d20403d","Type":"ContainerDied","Data":"8c9abae303e33ef24fa07b779b3789c76cf6678e14657913f0330d1220d9a67f"} Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.271104 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c9abae303e33ef24fa07b779b3789c76cf6678e14657913f0330d1220d9a67f" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.270785 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mzrwt" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.274798 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e86e9c9c-e52d-4d68-b713-b9793049ceee","Type":"ContainerStarted","Data":"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117"} Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.304015 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.303990067 podStartE2EDuration="2.303990067s" podCreationTimestamp="2026-01-21 09:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:12.298822228 +0000 UTC m=+1011.049289546" watchObservedRunningTime="2026-01-21 09:20:12.303990067 +0000 UTC m=+1011.054457375" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.327463 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 09:20:12 crc kubenswrapper[4618]: E0121 09:20:12.331221 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180187c8-6cea-49da-86c8-b0709d20403d" containerName="nova-cell1-conductor-db-sync" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.331251 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="180187c8-6cea-49da-86c8-b0709d20403d" containerName="nova-cell1-conductor-db-sync" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.331461 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="180187c8-6cea-49da-86c8-b0709d20403d" containerName="nova-cell1-conductor-db-sync" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.332159 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.333867 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.336465 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.400293 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7p52\" (UniqueName: \"kubernetes.io/projected/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-kube-api-access-v7p52\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.400375 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.400614 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.503501 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc 
kubenswrapper[4618]: I0121 09:20:12.503813 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7p52\" (UniqueName: \"kubernetes.io/projected/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-kube-api-access-v7p52\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.503857 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.509261 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.510390 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.521258 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7p52\" (UniqueName: \"kubernetes.io/projected/7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198-kube-api-access-v7p52\") pod \"nova-cell1-conductor-0\" (UID: \"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198\") " pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.593745 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.652030 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.707985 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-scripts\") pod \"f3fa4156-22b0-45b2-b300-e7a4b768964a\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.708433 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-combined-ca-bundle\") pod \"f3fa4156-22b0-45b2-b300-e7a4b768964a\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.708787 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/f3fa4156-22b0-45b2-b300-e7a4b768964a-kube-api-access-hlfvv\") pod \"f3fa4156-22b0-45b2-b300-e7a4b768964a\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.708945 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-config-data\") pod \"f3fa4156-22b0-45b2-b300-e7a4b768964a\" (UID: \"f3fa4156-22b0-45b2-b300-e7a4b768964a\") " Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.717164 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-scripts" (OuterVolumeSpecName: "scripts") pod "f3fa4156-22b0-45b2-b300-e7a4b768964a" (UID: "f3fa4156-22b0-45b2-b300-e7a4b768964a"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.717256 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fa4156-22b0-45b2-b300-e7a4b768964a-kube-api-access-hlfvv" (OuterVolumeSpecName: "kube-api-access-hlfvv") pod "f3fa4156-22b0-45b2-b300-e7a4b768964a" (UID: "f3fa4156-22b0-45b2-b300-e7a4b768964a"). InnerVolumeSpecName "kube-api-access-hlfvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.730266 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3fa4156-22b0-45b2-b300-e7a4b768964a" (UID: "f3fa4156-22b0-45b2-b300-e7a4b768964a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.732572 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-config-data" (OuterVolumeSpecName: "config-data") pod "f3fa4156-22b0-45b2-b300-e7a4b768964a" (UID: "f3fa4156-22b0-45b2-b300-e7a4b768964a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.811187 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/f3fa4156-22b0-45b2-b300-e7a4b768964a-kube-api-access-hlfvv\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.811218 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.811228 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:12 crc kubenswrapper[4618]: I0121 09:20:12.811238 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fa4156-22b0-45b2-b300-e7a4b768964a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:13 crc kubenswrapper[4618]: W0121 09:20:13.066119 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd5d2fd_6aeb_4bec_88ce_4d6ae0887198.slice/crio-8bf8070347deec1dde97bf323083215e1b40776bfc9fec5c17ed5344137257fe WatchSource:0}: Error finding container 8bf8070347deec1dde97bf323083215e1b40776bfc9fec5c17ed5344137257fe: Status 404 returned error can't find the container with id 8bf8070347deec1dde97bf323083215e1b40776bfc9fec5c17ed5344137257fe Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.066735 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.286591 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gl84v" Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.286587 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gl84v" event={"ID":"f3fa4156-22b0-45b2-b300-e7a4b768964a","Type":"ContainerDied","Data":"e11b388e5f7eb5919fde33a1c329a9ae5be3ab7f21116aad95887752955534bf"} Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.286659 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11b388e5f7eb5919fde33a1c329a9ae5be3ab7f21116aad95887752955534bf" Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.289919 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198","Type":"ContainerStarted","Data":"dac811a8037630a2cf6083d4d67d594a7f8437385cef60cb183caf964425a8c8"} Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.289970 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198","Type":"ContainerStarted","Data":"8bf8070347deec1dde97bf323083215e1b40776bfc9fec5c17ed5344137257fe"} Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.290011 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.332275 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.3322432069999999 podStartE2EDuration="1.332243207s" podCreationTimestamp="2026-01-21 09:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:13.319029205 +0000 UTC m=+1012.069496522" watchObservedRunningTime="2026-01-21 09:20:13.332243207 +0000 UTC m=+1012.082710524" Jan 21 09:20:13 crc 
kubenswrapper[4618]: I0121 09:20:13.513289 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.513531 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="537b32bd-fcec-4a2d-8adf-fbe3d145a95d" containerName="nova-scheduler-scheduler" containerID="cri-o://3e2d834ff9dc0893d333058d6573f7d9b5b3eddb14271e82d8d8731e775d7fea" gracePeriod=30 Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.520281 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.520478 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-log" containerID="cri-o://a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1" gracePeriod=30 Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.520559 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-api" containerID="cri-o://d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338" gracePeriod=30 Jan 21 09:20:13 crc kubenswrapper[4618]: I0121 09:20:13.563252 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.002751 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.056981 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140504cb-90bd-4e75-8718-11a596f15f72-logs\") pod \"140504cb-90bd-4e75-8718-11a596f15f72\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.057092 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gxch\" (UniqueName: \"kubernetes.io/projected/140504cb-90bd-4e75-8718-11a596f15f72-kube-api-access-8gxch\") pod \"140504cb-90bd-4e75-8718-11a596f15f72\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.057167 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-config-data\") pod \"140504cb-90bd-4e75-8718-11a596f15f72\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.057198 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-combined-ca-bundle\") pod \"140504cb-90bd-4e75-8718-11a596f15f72\" (UID: \"140504cb-90bd-4e75-8718-11a596f15f72\") " Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.057392 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140504cb-90bd-4e75-8718-11a596f15f72-logs" (OuterVolumeSpecName: "logs") pod "140504cb-90bd-4e75-8718-11a596f15f72" (UID: "140504cb-90bd-4e75-8718-11a596f15f72"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.057702 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140504cb-90bd-4e75-8718-11a596f15f72-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.062786 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140504cb-90bd-4e75-8718-11a596f15f72-kube-api-access-8gxch" (OuterVolumeSpecName: "kube-api-access-8gxch") pod "140504cb-90bd-4e75-8718-11a596f15f72" (UID: "140504cb-90bd-4e75-8718-11a596f15f72"). InnerVolumeSpecName "kube-api-access-8gxch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.082891 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-config-data" (OuterVolumeSpecName: "config-data") pod "140504cb-90bd-4e75-8718-11a596f15f72" (UID: "140504cb-90bd-4e75-8718-11a596f15f72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.085090 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "140504cb-90bd-4e75-8718-11a596f15f72" (UID: "140504cb-90bd-4e75-8718-11a596f15f72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.160497 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gxch\" (UniqueName: \"kubernetes.io/projected/140504cb-90bd-4e75-8718-11a596f15f72-kube-api-access-8gxch\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.160538 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.160550 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140504cb-90bd-4e75-8718-11a596f15f72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299830 4618 generic.go:334] "Generic (PLEG): container finished" podID="140504cb-90bd-4e75-8718-11a596f15f72" containerID="d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338" exitCode=0 Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299876 4618 generic.go:334] "Generic (PLEG): container finished" podID="140504cb-90bd-4e75-8718-11a596f15f72" containerID="a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1" exitCode=143 Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299874 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"140504cb-90bd-4e75-8718-11a596f15f72","Type":"ContainerDied","Data":"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338"} Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299918 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299940 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"140504cb-90bd-4e75-8718-11a596f15f72","Type":"ContainerDied","Data":"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1"} Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299958 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"140504cb-90bd-4e75-8718-11a596f15f72","Type":"ContainerDied","Data":"18a11fd0a5ee62f90b682f035f459e341c96fc04df5a1917758f12f7d0605a66"} Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.299980 4618 scope.go:117] "RemoveContainer" containerID="d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.300349 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-log" containerID="cri-o://df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35" gracePeriod=30 Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.300557 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-metadata" containerID="cri-o://d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117" gracePeriod=30 Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.342624 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.344882 4618 scope.go:117] "RemoveContainer" containerID="a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.355501 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] 
Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.376549 4618 scope.go:117] "RemoveContainer" containerID="d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.377307 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:14 crc kubenswrapper[4618]: E0121 09:20:14.377420 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338\": container with ID starting with d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338 not found: ID does not exist" containerID="d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.377494 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338"} err="failed to get container status \"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338\": rpc error: code = NotFound desc = could not find container \"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338\": container with ID starting with d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338 not found: ID does not exist" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.377556 4618 scope.go:117] "RemoveContainer" containerID="a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1" Jan 21 09:20:14 crc kubenswrapper[4618]: E0121 09:20:14.378078 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-api" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378111 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-api" Jan 21 09:20:14 crc kubenswrapper[4618]: E0121 
09:20:14.378161 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-log" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378171 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-log" Jan 21 09:20:14 crc kubenswrapper[4618]: E0121 09:20:14.378182 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fa4156-22b0-45b2-b300-e7a4b768964a" containerName="nova-manage" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378189 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fa4156-22b0-45b2-b300-e7a4b768964a" containerName="nova-manage" Jan 21 09:20:14 crc kubenswrapper[4618]: E0121 09:20:14.378360 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1\": container with ID starting with a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1 not found: ID does not exist" containerID="a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378429 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1"} err="failed to get container status \"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1\": rpc error: code = NotFound desc = could not find container \"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1\": container with ID starting with a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1 not found: ID does not exist" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378481 4618 scope.go:117] "RemoveContainer" containerID="d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338" Jan 21 09:20:14 crc 
kubenswrapper[4618]: I0121 09:20:14.378515 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3fa4156-22b0-45b2-b300-e7a4b768964a" containerName="nova-manage" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378533 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-api" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378546 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="140504cb-90bd-4e75-8718-11a596f15f72" containerName="nova-api-log" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378946 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338"} err="failed to get container status \"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338\": rpc error: code = NotFound desc = could not find container \"d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338\": container with ID starting with d54d6455c00b3d407126f5e3ca2b75223ace17e4b852c8585ecf87eccb0d0338 not found: ID does not exist" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.378993 4618 scope.go:117] "RemoveContainer" containerID="a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.379511 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1"} err="failed to get container status \"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1\": rpc error: code = NotFound desc = could not find container \"a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1\": container with ID starting with a3930f4e0fc4e6776c67452c6174e9ea4f095435fa65382b3426c600c6f5d9a1 not found: ID does not exist" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 
09:20:14.380061 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.382441 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.391037 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.468854 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-config-data\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.469153 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgxg\" (UniqueName: \"kubernetes.io/projected/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-kube-api-access-zhgxg\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.469225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.469595 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-logs\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.571666 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgxg\" (UniqueName: \"kubernetes.io/projected/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-kube-api-access-zhgxg\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.571714 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.571872 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-logs\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.572019 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-config-data\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.572346 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-logs\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.576663 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-config-data\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 
21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.576733 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.585488 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgxg\" (UniqueName: \"kubernetes.io/projected/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-kube-api-access-zhgxg\") pod \"nova-api-0\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.675212 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.708206 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.723172 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-rjwsk"] Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.723573 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerName="dnsmasq-dns" containerID="cri-o://b4030d3306f46a34332c6f7410af9b79ae2faab4495a8e8cc91750cb850a5769" gracePeriod=10 Jan 21 09:20:14 crc kubenswrapper[4618]: I0121 09:20:14.883242 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.084591 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-nova-metadata-tls-certs\") pod \"e86e9c9c-e52d-4d68-b713-b9793049ceee\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.084655 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e86e9c9c-e52d-4d68-b713-b9793049ceee-logs\") pod \"e86e9c9c-e52d-4d68-b713-b9793049ceee\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.084695 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-config-data\") pod \"e86e9c9c-e52d-4d68-b713-b9793049ceee\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.084773 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7wq7\" (UniqueName: \"kubernetes.io/projected/e86e9c9c-e52d-4d68-b713-b9793049ceee-kube-api-access-j7wq7\") pod \"e86e9c9c-e52d-4d68-b713-b9793049ceee\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.084921 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-combined-ca-bundle\") pod \"e86e9c9c-e52d-4d68-b713-b9793049ceee\" (UID: \"e86e9c9c-e52d-4d68-b713-b9793049ceee\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.085537 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e86e9c9c-e52d-4d68-b713-b9793049ceee-logs" (OuterVolumeSpecName: "logs") pod "e86e9c9c-e52d-4d68-b713-b9793049ceee" (UID: "e86e9c9c-e52d-4d68-b713-b9793049ceee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.093410 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86e9c9c-e52d-4d68-b713-b9793049ceee-kube-api-access-j7wq7" (OuterVolumeSpecName: "kube-api-access-j7wq7") pod "e86e9c9c-e52d-4d68-b713-b9793049ceee" (UID: "e86e9c9c-e52d-4d68-b713-b9793049ceee"). InnerVolumeSpecName "kube-api-access-j7wq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.174335 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e86e9c9c-e52d-4d68-b713-b9793049ceee" (UID: "e86e9c9c-e52d-4d68-b713-b9793049ceee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.183275 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-config-data" (OuterVolumeSpecName: "config-data") pod "e86e9c9c-e52d-4d68-b713-b9793049ceee" (UID: "e86e9c9c-e52d-4d68-b713-b9793049ceee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.187987 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.188024 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e86e9c9c-e52d-4d68-b713-b9793049ceee-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.188087 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.188104 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7wq7\" (UniqueName: \"kubernetes.io/projected/e86e9c9c-e52d-4d68-b713-b9793049ceee-kube-api-access-j7wq7\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.197409 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.208182 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e86e9c9c-e52d-4d68-b713-b9793049ceee" (UID: "e86e9c9c-e52d-4d68-b713-b9793049ceee"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.289158 4618 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86e9c9c-e52d-4d68-b713-b9793049ceee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.317716 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c","Type":"ContainerStarted","Data":"d64bacbd8562fc480f4c514f7262ff8e8ca84d0f98354aa2b5e94e82b65164ae"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.319922 4618 generic.go:334] "Generic (PLEG): container finished" podID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerID="d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117" exitCode=0 Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.320035 4618 generic.go:334] "Generic (PLEG): container finished" podID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerID="df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35" exitCode=143 Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.319980 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e86e9c9c-e52d-4d68-b713-b9793049ceee","Type":"ContainerDied","Data":"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.320192 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e86e9c9c-e52d-4d68-b713-b9793049ceee","Type":"ContainerDied","Data":"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.320221 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e86e9c9c-e52d-4d68-b713-b9793049ceee","Type":"ContainerDied","Data":"98c2d9cf5fe00ebcc59c9cecf1527aebc4d6a512adba5231c17e1a779a87024a"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.319999 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.320294 4618 scope.go:117] "RemoveContainer" containerID="d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.323261 4618 generic.go:334] "Generic (PLEG): container finished" podID="537b32bd-fcec-4a2d-8adf-fbe3d145a95d" containerID="3e2d834ff9dc0893d333058d6573f7d9b5b3eddb14271e82d8d8731e775d7fea" exitCode=0 Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.323328 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"537b32bd-fcec-4a2d-8adf-fbe3d145a95d","Type":"ContainerDied","Data":"3e2d834ff9dc0893d333058d6573f7d9b5b3eddb14271e82d8d8731e775d7fea"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.323447 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"537b32bd-fcec-4a2d-8adf-fbe3d145a95d","Type":"ContainerDied","Data":"bca99678dc84a03d669d0270a2d733f83add07b0f2c097ca10bc575cb95f5e0d"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.323506 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca99678dc84a03d669d0270a2d733f83add07b0f2c097ca10bc575cb95f5e0d" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.324998 4618 generic.go:334] "Generic (PLEG): container finished" podID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerID="b4030d3306f46a34332c6f7410af9b79ae2faab4495a8e8cc91750cb850a5769" exitCode=0 Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.325072 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" 
event={"ID":"32a4f5ed-f364-473e-b70e-736ff25ad7cd","Type":"ContainerDied","Data":"b4030d3306f46a34332c6f7410af9b79ae2faab4495a8e8cc91750cb850a5769"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.325096 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" event={"ID":"32a4f5ed-f364-473e-b70e-736ff25ad7cd","Type":"ContainerDied","Data":"cae5a4f64795b9dc8f99228b5c7ac9cd68115d368d216c1d6b8f05a819b06096"} Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.325108 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cae5a4f64795b9dc8f99228b5c7ac9cd68115d368d216c1d6b8f05a819b06096" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.346346 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.355482 4618 scope.go:117] "RemoveContainer" containerID="df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.373667 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.382598 4618 scope.go:117] "RemoveContainer" containerID="d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117" Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.383004 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117\": container with ID starting with d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117 not found: ID does not exist" containerID="d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.383044 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117"} err="failed to get container status \"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117\": rpc error: code = NotFound desc = could not find container \"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117\": container with ID starting with d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117 not found: ID does not exist" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.383070 4618 scope.go:117] "RemoveContainer" containerID="df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35" Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.383395 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35\": container with ID starting with df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35 not found: ID does not exist" containerID="df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 
09:20:15.383491 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35"} err="failed to get container status \"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35\": rpc error: code = NotFound desc = could not find container \"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35\": container with ID starting with df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35 not found: ID does not exist" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.383571 4618 scope.go:117] "RemoveContainer" containerID="d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.383873 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117"} err="failed to get container status \"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117\": rpc error: code = NotFound desc = could not find container \"d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117\": container with ID starting with d6f8a510c1a1657fbe262bb4d7da5cb5f3d2c3156cea977dc41b53ccbfd2d117 not found: ID does not exist" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.383908 4618 scope.go:117] "RemoveContainer" containerID="df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.384216 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35"} err="failed to get container status \"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35\": rpc error: code = NotFound desc = could not find container \"df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35\": container with ID starting with 
df656b141b7bc9f89c793931dc30c1d234677f0ccf2783472d3b7beb9f0f2a35 not found: ID does not exist" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.391423 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-swift-storage-0\") pod \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.393862 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-nb\") pod \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.422324 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.439297 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.453680 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.454155 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537b32bd-fcec-4a2d-8adf-fbe3d145a95d" containerName="nova-scheduler-scheduler" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454169 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="537b32bd-fcec-4a2d-8adf-fbe3d145a95d" containerName="nova-scheduler-scheduler" Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.454187 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerName="dnsmasq-dns" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454193 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerName="dnsmasq-dns" Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.454209 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-log" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454215 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-log" Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.454235 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerName="init" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454241 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerName="init" Jan 21 09:20:15 crc kubenswrapper[4618]: E0121 09:20:15.454249 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-metadata" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454254 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-metadata" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454419 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="537b32bd-fcec-4a2d-8adf-fbe3d145a95d" containerName="nova-scheduler-scheduler" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454435 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-log" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454445 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" containerName="dnsmasq-dns" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.454458 4618 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" containerName="nova-metadata-metadata" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.455479 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.457258 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.457393 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.459885 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.462479 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32a4f5ed-f364-473e-b70e-736ff25ad7cd" (UID: "32a4f5ed-f364-473e-b70e-736ff25ad7cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.474288 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32a4f5ed-f364-473e-b70e-736ff25ad7cd" (UID: "32a4f5ed-f364-473e-b70e-736ff25ad7cd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.497562 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-config\") pod \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.497705 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-sb\") pod \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.497821 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-svc\") pod \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.497934 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-config-data\") pod \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.498018 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-combined-ca-bundle\") pod \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.498161 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7kqq\" (UniqueName: 
\"kubernetes.io/projected/32a4f5ed-f364-473e-b70e-736ff25ad7cd-kube-api-access-b7kqq\") pod \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\" (UID: \"32a4f5ed-f364-473e-b70e-736ff25ad7cd\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.498235 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l58h\" (UniqueName: \"kubernetes.io/projected/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-kube-api-access-5l58h\") pod \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\" (UID: \"537b32bd-fcec-4a2d-8adf-fbe3d145a95d\") " Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.498577 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52ck7\" (UniqueName: \"kubernetes.io/projected/0d9730db-82be-44ff-87a8-23912cbdb99b-kube-api-access-52ck7\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.498775 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.498930 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.499044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-config-data\") 
pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.499106 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d9730db-82be-44ff-87a8-23912cbdb99b-logs\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.499264 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.499324 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.501762 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a4f5ed-f364-473e-b70e-736ff25ad7cd-kube-api-access-b7kqq" (OuterVolumeSpecName: "kube-api-access-b7kqq") pod "32a4f5ed-f364-473e-b70e-736ff25ad7cd" (UID: "32a4f5ed-f364-473e-b70e-736ff25ad7cd"). InnerVolumeSpecName "kube-api-access-b7kqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.504832 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-kube-api-access-5l58h" (OuterVolumeSpecName: "kube-api-access-5l58h") pod "537b32bd-fcec-4a2d-8adf-fbe3d145a95d" (UID: "537b32bd-fcec-4a2d-8adf-fbe3d145a95d"). InnerVolumeSpecName "kube-api-access-5l58h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.536247 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "537b32bd-fcec-4a2d-8adf-fbe3d145a95d" (UID: "537b32bd-fcec-4a2d-8adf-fbe3d145a95d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.542960 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-config-data" (OuterVolumeSpecName: "config-data") pod "537b32bd-fcec-4a2d-8adf-fbe3d145a95d" (UID: "537b32bd-fcec-4a2d-8adf-fbe3d145a95d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.551219 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-config" (OuterVolumeSpecName: "config") pod "32a4f5ed-f364-473e-b70e-736ff25ad7cd" (UID: "32a4f5ed-f364-473e-b70e-736ff25ad7cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.553325 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32a4f5ed-f364-473e-b70e-736ff25ad7cd" (UID: "32a4f5ed-f364-473e-b70e-736ff25ad7cd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.554510 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32a4f5ed-f364-473e-b70e-736ff25ad7cd" (UID: "32a4f5ed-f364-473e-b70e-736ff25ad7cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.555347 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140504cb-90bd-4e75-8718-11a596f15f72" path="/var/lib/kubelet/pods/140504cb-90bd-4e75-8718-11a596f15f72/volumes" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.555990 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86e9c9c-e52d-4d68-b713-b9793049ceee" path="/var/lib/kubelet/pods/e86e9c9c-e52d-4d68-b713-b9793049ceee/volumes" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600188 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52ck7\" (UniqueName: \"kubernetes.io/projected/0d9730db-82be-44ff-87a8-23912cbdb99b-kube-api-access-52ck7\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600524 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600578 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600639 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-config-data\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600660 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d9730db-82be-44ff-87a8-23912cbdb99b-logs\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600743 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600758 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600769 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a4f5ed-f364-473e-b70e-736ff25ad7cd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600777 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600785 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600794 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7kqq\" (UniqueName: \"kubernetes.io/projected/32a4f5ed-f364-473e-b70e-736ff25ad7cd-kube-api-access-b7kqq\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.600802 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l58h\" (UniqueName: \"kubernetes.io/projected/537b32bd-fcec-4a2d-8adf-fbe3d145a95d-kube-api-access-5l58h\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.602287 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d9730db-82be-44ff-87a8-23912cbdb99b-logs\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.604253 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.605592 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-config-data\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.606033 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.614892 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52ck7\" (UniqueName: \"kubernetes.io/projected/0d9730db-82be-44ff-87a8-23912cbdb99b-kube-api-access-52ck7\") pod \"nova-metadata-0\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " pod="openstack/nova-metadata-0" Jan 21 09:20:15 crc kubenswrapper[4618]: I0121 09:20:15.774071 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:20:16 crc kubenswrapper[4618]: W0121 09:20:16.199975 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d9730db_82be_44ff_87a8_23912cbdb99b.slice/crio-476b78edff90f65dc9dc18ef0e530666342775360ec287eb67a2caefd74407f4 WatchSource:0}: Error finding container 476b78edff90f65dc9dc18ef0e530666342775360ec287eb67a2caefd74407f4: Status 404 returned error can't find the container with id 476b78edff90f65dc9dc18ef0e530666342775360ec287eb67a2caefd74407f4 Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.202821 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.335394 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0d9730db-82be-44ff-87a8-23912cbdb99b","Type":"ContainerStarted","Data":"476b78edff90f65dc9dc18ef0e530666342775360ec287eb67a2caefd74407f4"} Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.338933 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c","Type":"ContainerStarted","Data":"5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37"} Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.338965 
4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c","Type":"ContainerStarted","Data":"f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30"} Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.341868 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-rjwsk" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.342025 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.360746 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.3607300589999998 podStartE2EDuration="2.360730059s" podCreationTimestamp="2026-01-21 09:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:16.356771173 +0000 UTC m=+1015.107238490" watchObservedRunningTime="2026-01-21 09:20:16.360730059 +0000 UTC m=+1015.111197376" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.382353 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-rjwsk"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.419252 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-rjwsk"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.428508 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.435176 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.440985 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 
09:20:16.442403 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.444503 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.450067 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.522294 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kccz\" (UniqueName: \"kubernetes.io/projected/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-kube-api-access-2kccz\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.522358 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.522424 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-config-data\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.625092 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kccz\" (UniqueName: \"kubernetes.io/projected/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-kube-api-access-2kccz\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 
09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.625166 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.625238 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-config-data\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.631086 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-config-data\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.631234 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.640778 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kccz\" (UniqueName: \"kubernetes.io/projected/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-kube-api-access-2kccz\") pod \"nova-scheduler-0\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " pod="openstack/nova-scheduler-0" Jan 21 09:20:16 crc kubenswrapper[4618]: I0121 09:20:16.760323 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.172127 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.356769 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0d9730db-82be-44ff-87a8-23912cbdb99b","Type":"ContainerStarted","Data":"ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0"} Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.356837 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0d9730db-82be-44ff-87a8-23912cbdb99b","Type":"ContainerStarted","Data":"2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5"} Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.360296 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d","Type":"ContainerStarted","Data":"4037b30c9805ba652e9878de293c8d226de7f10d90655ddcdc5b4191d469e013"} Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.360358 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d","Type":"ContainerStarted","Data":"f0c188a1521e9f4a29de71c347d3d3f91a9b9a1b278d382abdf2366ae31495b0"} Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.375117 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.375102943 podStartE2EDuration="2.375102943s" podCreationTimestamp="2026-01-21 09:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:17.372402523 +0000 UTC m=+1016.122869840" watchObservedRunningTime="2026-01-21 09:20:17.375102943 +0000 UTC m=+1016.125570259" Jan 21 09:20:17 crc 
kubenswrapper[4618]: I0121 09:20:17.389184 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.389169629 podStartE2EDuration="1.389169629s" podCreationTimestamp="2026-01-21 09:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:17.384412983 +0000 UTC m=+1016.134880300" watchObservedRunningTime="2026-01-21 09:20:17.389169629 +0000 UTC m=+1016.139636946" Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.551114 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a4f5ed-f364-473e-b70e-736ff25ad7cd" path="/var/lib/kubelet/pods/32a4f5ed-f364-473e-b70e-736ff25ad7cd/volumes" Jan 21 09:20:17 crc kubenswrapper[4618]: I0121 09:20:17.552242 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537b32bd-fcec-4a2d-8adf-fbe3d145a95d" path="/var/lib/kubelet/pods/537b32bd-fcec-4a2d-8adf-fbe3d145a95d/volumes" Jan 21 09:20:20 crc kubenswrapper[4618]: I0121 09:20:20.293040 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 09:20:20 crc kubenswrapper[4618]: I0121 09:20:20.774256 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 09:20:20 crc kubenswrapper[4618]: I0121 09:20:20.774324 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 09:20:21 crc kubenswrapper[4618]: I0121 09:20:21.761473 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 09:20:22 crc kubenswrapper[4618]: I0121 09:20:22.674969 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 21 09:20:24 crc kubenswrapper[4618]: I0121 09:20:24.708700 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Jan 21 09:20:24 crc kubenswrapper[4618]: I0121 09:20:24.709047 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 09:20:25 crc kubenswrapper[4618]: I0121 09:20:25.775341 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 09:20:25 crc kubenswrapper[4618]: I0121 09:20:25.775420 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 09:20:25 crc kubenswrapper[4618]: I0121 09:20:25.790306 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 09:20:25 crc kubenswrapper[4618]: I0121 09:20:25.790340 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 09:20:26 crc kubenswrapper[4618]: I0121 09:20:26.760528 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 09:20:26 crc kubenswrapper[4618]: I0121 09:20:26.787406 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 09:20:26 crc kubenswrapper[4618]: I0121 09:20:26.795275 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 
21 09:20:26 crc kubenswrapper[4618]: I0121 09:20:26.795311 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 09:20:26 crc kubenswrapper[4618]: I0121 09:20:26.959459 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:20:26 crc kubenswrapper[4618]: I0121 09:20:26.959529 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:20:27 crc kubenswrapper[4618]: I0121 09:20:27.489258 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.713048 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.714095 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.714382 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.714413 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.716514 4618 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.717488 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.876691 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-4cztp"] Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.878539 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.890499 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-4cztp"] Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.900637 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz84c\" (UniqueName: \"kubernetes.io/projected/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-kube-api-access-rz84c\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.900733 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.901025 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-config\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:34 crc 
kubenswrapper[4618]: I0121 09:20:34.901095 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.901209 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:34 crc kubenswrapper[4618]: I0121 09:20:34.901370 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.003350 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.003413 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz84c\" (UniqueName: \"kubernetes.io/projected/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-kube-api-access-rz84c\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 
21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.003469 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.003592 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-config\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.003642 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.003701 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.004587 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.005233 4618 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.005467 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-config\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.006000 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.006070 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.024581 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz84c\" (UniqueName: \"kubernetes.io/projected/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-kube-api-access-rz84c\") pod \"dnsmasq-dns-fcd6f8f8f-4cztp\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.202954 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.645455 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-4cztp"] Jan 21 09:20:35 crc kubenswrapper[4618]: W0121 09:20:35.647742 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb7bafe_9ad7_4ecf_9bd8_90c11bddc43d.slice/crio-402b9bdd57418b4292b1002d8013dfa3410cf0edbbb091b4ea1ae977dd81a92f WatchSource:0}: Error finding container 402b9bdd57418b4292b1002d8013dfa3410cf0edbbb091b4ea1ae977dd81a92f: Status 404 returned error can't find the container with id 402b9bdd57418b4292b1002d8013dfa3410cf0edbbb091b4ea1ae977dd81a92f Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.779026 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.782700 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 09:20:35 crc kubenswrapper[4618]: I0121 09:20:35.792408 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.563118 4618 generic.go:334] "Generic (PLEG): container finished" podID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerID="3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d" exitCode=0 Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.564392 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" event={"ID":"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d","Type":"ContainerDied","Data":"3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d"} Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.564452 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 
21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.564475 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" event={"ID":"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d","Type":"ContainerStarted","Data":"402b9bdd57418b4292b1002d8013dfa3410cf0edbbb091b4ea1ae977dd81a92f"} Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.564716 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-central-agent" containerID="cri-o://fd4775994f261f1543cc3ef4d4ab1c6349da0abbee527e7d707780fb541ce53b" gracePeriod=30 Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.565897 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="proxy-httpd" containerID="cri-o://c9ad8aacd8252338c28318b975f968ea17f799660546b6e8c06e9d3782f664b5" gracePeriod=30 Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.565994 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="sg-core" containerID="cri-o://8556a0e1a4f75152da5542599b05f270d7e7a8c6414427c95c8543069e7d0b8f" gracePeriod=30 Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.566049 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-notification-agent" containerID="cri-o://13b8fc9f19f652ea23485a00955662e207dd7cd5307bf487db73ef156e003851" gracePeriod=30 Jan 21 09:20:36 crc kubenswrapper[4618]: I0121 09:20:36.577601 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.100462 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.577158 4618 generic.go:334] "Generic (PLEG): container finished" podID="963186b6-ebce-48d8-8e25-36c695cee351" containerID="c9ad8aacd8252338c28318b975f968ea17f799660546b6e8c06e9d3782f664b5" exitCode=0 Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.577490 4618 generic.go:334] "Generic (PLEG): container finished" podID="963186b6-ebce-48d8-8e25-36c695cee351" containerID="8556a0e1a4f75152da5542599b05f270d7e7a8c6414427c95c8543069e7d0b8f" exitCode=2 Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.577501 4618 generic.go:334] "Generic (PLEG): container finished" podID="963186b6-ebce-48d8-8e25-36c695cee351" containerID="fd4775994f261f1543cc3ef4d4ab1c6349da0abbee527e7d707780fb541ce53b" exitCode=0 Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.577186 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerDied","Data":"c9ad8aacd8252338c28318b975f968ea17f799660546b6e8c06e9d3782f664b5"} Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.577595 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerDied","Data":"8556a0e1a4f75152da5542599b05f270d7e7a8c6414427c95c8543069e7d0b8f"} Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.577615 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerDied","Data":"fd4775994f261f1543cc3ef4d4ab1c6349da0abbee527e7d707780fb541ce53b"} Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.580697 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" event={"ID":"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d","Type":"ContainerStarted","Data":"271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464"} 
Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.580936 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-log" containerID="cri-o://f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30" gracePeriod=30 Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.581032 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-api" containerID="cri-o://5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37" gracePeriod=30 Jan 21 09:20:37 crc kubenswrapper[4618]: I0121 09:20:37.606972 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" podStartSLOduration=3.606950191 podStartE2EDuration="3.606950191s" podCreationTimestamp="2026-01-21 09:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:37.601327166 +0000 UTC m=+1036.351794472" watchObservedRunningTime="2026-01-21 09:20:37.606950191 +0000 UTC m=+1036.357417508" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.596957 4618 generic.go:334] "Generic (PLEG): container finished" podID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerID="f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30" exitCode=143 Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.597152 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c","Type":"ContainerDied","Data":"f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30"} Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.611753 4618 generic.go:334] "Generic (PLEG): container finished" podID="963186b6-ebce-48d8-8e25-36c695cee351" 
containerID="13b8fc9f19f652ea23485a00955662e207dd7cd5307bf487db73ef156e003851" exitCode=0 Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.611894 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerDied","Data":"13b8fc9f19f652ea23485a00955662e207dd7cd5307bf487db73ef156e003851"} Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.612961 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.728822 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897320 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-sg-core-conf-yaml\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897395 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-ceilometer-tls-certs\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897425 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6ht4\" (UniqueName: \"kubernetes.io/projected/963186b6-ebce-48d8-8e25-36c695cee351-kube-api-access-f6ht4\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897444 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-scripts\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897530 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-config-data\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897592 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-run-httpd\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897615 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-log-httpd\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.897699 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-combined-ca-bundle\") pod \"963186b6-ebce-48d8-8e25-36c695cee351\" (UID: \"963186b6-ebce-48d8-8e25-36c695cee351\") " Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.899608 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.901932 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.906385 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-scripts" (OuterVolumeSpecName: "scripts") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.910284 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963186b6-ebce-48d8-8e25-36c695cee351-kube-api-access-f6ht4" (OuterVolumeSpecName: "kube-api-access-f6ht4") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "kube-api-access-f6ht4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.930945 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.946326 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.970841 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:38 crc kubenswrapper[4618]: I0121 09:20:38.987993 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-config-data" (OuterVolumeSpecName: "config-data") pod "963186b6-ebce-48d8-8e25-36c695cee351" (UID: "963186b6-ebce-48d8-8e25-36c695cee351"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.001686 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.001775 4618 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.001828 4618 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963186b6-ebce-48d8-8e25-36c695cee351-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.001889 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.001942 4618 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.002582 4618 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.002617 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6ht4\" (UniqueName: \"kubernetes.io/projected/963186b6-ebce-48d8-8e25-36c695cee351-kube-api-access-f6ht4\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.002661 4618 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963186b6-ebce-48d8-8e25-36c695cee351-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.583293 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.620786 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"963186b6-ebce-48d8-8e25-36c695cee351","Type":"ContainerDied","Data":"e31bf467f8d93c05449feda295472b8b1d94217375f1ba84229ca0419cdc26de"} Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.621351 4618 scope.go:117] "RemoveContainer" containerID="c9ad8aacd8252338c28318b975f968ea17f799660546b6e8c06e9d3782f664b5" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.621548 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.622673 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52kzz\" (UniqueName: \"kubernetes.io/projected/9bc65221-2632-4f13-94fb-54dd961ee4d3-kube-api-access-52kzz\") pod \"9bc65221-2632-4f13-94fb-54dd961ee4d3\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.622821 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-config-data\") pod \"9bc65221-2632-4f13-94fb-54dd961ee4d3\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.622936 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-combined-ca-bundle\") pod 
\"9bc65221-2632-4f13-94fb-54dd961ee4d3\" (UID: \"9bc65221-2632-4f13-94fb-54dd961ee4d3\") " Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.627337 4618 generic.go:334] "Generic (PLEG): container finished" podID="9bc65221-2632-4f13-94fb-54dd961ee4d3" containerID="b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9" exitCode=137 Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.627885 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.628034 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bc65221-2632-4f13-94fb-54dd961ee4d3","Type":"ContainerDied","Data":"b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9"} Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.628065 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bc65221-2632-4f13-94fb-54dd961ee4d3","Type":"ContainerDied","Data":"f38ad41e1b233c9d971b8c3367ab5e88f93e7041ae80ba9eba72064b4d46000c"} Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.628929 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc65221-2632-4f13-94fb-54dd961ee4d3-kube-api-access-52kzz" (OuterVolumeSpecName: "kube-api-access-52kzz") pod "9bc65221-2632-4f13-94fb-54dd961ee4d3" (UID: "9bc65221-2632-4f13-94fb-54dd961ee4d3"). InnerVolumeSpecName "kube-api-access-52kzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.652542 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.659375 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-config-data" (OuterVolumeSpecName: "config-data") pod "9bc65221-2632-4f13-94fb-54dd961ee4d3" (UID: "9bc65221-2632-4f13-94fb-54dd961ee4d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.667249 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.670958 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: E0121 09:20:39.671359 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="proxy-httpd" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671377 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="proxy-httpd" Jan 21 09:20:39 crc kubenswrapper[4618]: E0121 09:20:39.671397 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="sg-core" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671404 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="sg-core" Jan 21 09:20:39 crc kubenswrapper[4618]: E0121 09:20:39.671419 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc65221-2632-4f13-94fb-54dd961ee4d3" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671425 4618 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc65221-2632-4f13-94fb-54dd961ee4d3" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 09:20:39 crc kubenswrapper[4618]: E0121 09:20:39.671446 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-central-agent" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671452 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-central-agent" Jan 21 09:20:39 crc kubenswrapper[4618]: E0121 09:20:39.671471 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-notification-agent" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671479 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-notification-agent" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671642 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-notification-agent" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671657 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="sg-core" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671665 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc65221-2632-4f13-94fb-54dd961ee4d3" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671677 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="963186b6-ebce-48d8-8e25-36c695cee351" containerName="ceilometer-central-agent" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.671689 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="963186b6-ebce-48d8-8e25-36c695cee351" 
containerName="proxy-httpd" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.672264 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bc65221-2632-4f13-94fb-54dd961ee4d3" (UID: "9bc65221-2632-4f13-94fb-54dd961ee4d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.673229 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.676365 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.677182 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.677429 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.690583 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.730739 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.730764 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc65221-2632-4f13-94fb-54dd961ee4d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.730775 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52kzz\" (UniqueName: 
\"kubernetes.io/projected/9bc65221-2632-4f13-94fb-54dd961ee4d3-kube-api-access-52kzz\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.749877 4618 scope.go:117] "RemoveContainer" containerID="8556a0e1a4f75152da5542599b05f270d7e7a8c6414427c95c8543069e7d0b8f" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.774617 4618 scope.go:117] "RemoveContainer" containerID="13b8fc9f19f652ea23485a00955662e207dd7cd5307bf487db73ef156e003851" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.805016 4618 scope.go:117] "RemoveContainer" containerID="fd4775994f261f1543cc3ef4d4ab1c6349da0abbee527e7d707780fb541ce53b" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.836264 4618 scope.go:117] "RemoveContainer" containerID="b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.839733 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7fg\" (UniqueName: \"kubernetes.io/projected/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-kube-api-access-ht7fg\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.839967 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.840017 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 
09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.840241 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-config-data\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.840762 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-run-httpd\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.840794 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-log-httpd\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.840817 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.841443 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-scripts\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.857089 4618 scope.go:117] "RemoveContainer" 
containerID="b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9" Jan 21 09:20:39 crc kubenswrapper[4618]: E0121 09:20:39.857688 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9\": container with ID starting with b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9 not found: ID does not exist" containerID="b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.857738 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9"} err="failed to get container status \"b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9\": rpc error: code = NotFound desc = could not find container \"b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9\": container with ID starting with b6b0ccbbad58cfb8750c1bab711772da8db62368284b79ecd516d3a3b38e30d9 not found: ID does not exist" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.942732 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.942882 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.943037 4618 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-config-data\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.943245 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-run-httpd\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.943344 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-log-httpd\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.943417 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.943520 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-scripts\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.943673 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7fg\" (UniqueName: \"kubernetes.io/projected/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-kube-api-access-ht7fg\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc 
kubenswrapper[4618]: I0121 09:20:39.943945 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-log-httpd\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.944192 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-run-httpd\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.950756 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.953987 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.956573 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.956589 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-scripts\") pod \"ceilometer-0\" (UID: 
\"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.956881 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-config-data\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.966503 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7fg\" (UniqueName: \"kubernetes.io/projected/c2ceaf0a-1783-4b11-9f67-a5c8948c589d-kube-api-access-ht7fg\") pod \"ceilometer-0\" (UID: \"c2ceaf0a-1783-4b11-9f67-a5c8948c589d\") " pod="openstack/ceilometer-0" Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.971560 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.984544 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.996218 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:39 crc kubenswrapper[4618]: I0121 09:20:39.997561 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.004638 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.004895 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.005404 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.012176 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.043506 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.044914 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd777\" (UniqueName: \"kubernetes.io/projected/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-kube-api-access-sd777\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.044989 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.045225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.045415 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.045450 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.146317 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd777\" (UniqueName: \"kubernetes.io/projected/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-kube-api-access-sd777\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.146366 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.146429 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.146496 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.146514 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.151547 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.151670 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.152017 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.152341 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.160804 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd777\" (UniqueName: \"kubernetes.io/projected/d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa-kube-api-access-sd777\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.314788 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.458503 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.642504 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2ceaf0a-1783-4b11-9f67-a5c8948c589d","Type":"ContainerStarted","Data":"69f56ba52f89caa2a5290709bd924509672510deffc1d4f31d0a73255d062a40"} Jan 21 09:20:40 crc kubenswrapper[4618]: I0121 09:20:40.766573 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 09:20:40 crc kubenswrapper[4618]: W0121 09:20:40.783660 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3c60646_177a_4ed0_ab65_6a9ba9f3b7aa.slice/crio-2440de790864753064f1c35ab884583d93484803d35a0f0ffe720d91ed7b8317 WatchSource:0}: Error finding container 
2440de790864753064f1c35ab884583d93484803d35a0f0ffe720d91ed7b8317: Status 404 returned error can't find the container with id 2440de790864753064f1c35ab884583d93484803d35a0f0ffe720d91ed7b8317 Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.094322 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.178122 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-config-data\") pod \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.178260 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-combined-ca-bundle\") pod \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.178303 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhgxg\" (UniqueName: \"kubernetes.io/projected/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-kube-api-access-zhgxg\") pod \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.178359 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-logs\") pod \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\" (UID: \"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c\") " Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.179885 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-logs" (OuterVolumeSpecName: 
"logs") pod "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" (UID: "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.185090 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-kube-api-access-zhgxg" (OuterVolumeSpecName: "kube-api-access-zhgxg") pod "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" (UID: "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c"). InnerVolumeSpecName "kube-api-access-zhgxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.217026 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" (UID: "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.221778 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-config-data" (OuterVolumeSpecName: "config-data") pod "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" (UID: "475aa2ca-c0f7-47e5-8ea8-7274ebbc292c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.281645 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.281680 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.281696 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.281709 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhgxg\" (UniqueName: \"kubernetes.io/projected/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c-kube-api-access-zhgxg\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.556926 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963186b6-ebce-48d8-8e25-36c695cee351" path="/var/lib/kubelet/pods/963186b6-ebce-48d8-8e25-36c695cee351/volumes" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.563387 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc65221-2632-4f13-94fb-54dd961ee4d3" path="/var/lib/kubelet/pods/9bc65221-2632-4f13-94fb-54dd961ee4d3/volumes" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.662030 4618 generic.go:334] "Generic (PLEG): container finished" podID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerID="5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37" exitCode=0 Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.662090 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.662117 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c","Type":"ContainerDied","Data":"5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37"} Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.662166 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"475aa2ca-c0f7-47e5-8ea8-7274ebbc292c","Type":"ContainerDied","Data":"d64bacbd8562fc480f4c514f7262ff8e8ca84d0f98354aa2b5e94e82b65164ae"} Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.662187 4618 scope.go:117] "RemoveContainer" containerID="5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.664693 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2ceaf0a-1783-4b11-9f67-a5c8948c589d","Type":"ContainerStarted","Data":"2b491e74ea0d66b0199b0cdf8a8284be93ffb80ac27c1f9e71001f17e69ff736"} Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.667917 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa","Type":"ContainerStarted","Data":"09738197b3f4dd92893006d16882ba3d930287f6e8b140dee4e812788c4d1df9"} Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.667959 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa","Type":"ContainerStarted","Data":"2440de790864753064f1c35ab884583d93484803d35a0f0ffe720d91ed7b8317"} Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.691040 4618 scope.go:117] "RemoveContainer" containerID="f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.691720 4618 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.703689 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.708986 4618 scope.go:117] "RemoveContainer" containerID="5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37" Jan 21 09:20:41 crc kubenswrapper[4618]: E0121 09:20:41.715398 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37\": container with ID starting with 5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37 not found: ID does not exist" containerID="5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.715439 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37"} err="failed to get container status \"5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37\": rpc error: code = NotFound desc = could not find container \"5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37\": container with ID starting with 5b41f68f2fef960b4218868612248bd85ddd5aa944168ffee68703e2f1024d37 not found: ID does not exist" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.715468 4618 scope.go:117] "RemoveContainer" containerID="f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30" Jan 21 09:20:41 crc kubenswrapper[4618]: E0121 09:20:41.715775 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30\": container with ID starting with f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30 not 
found: ID does not exist" containerID="f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.715824 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30"} err="failed to get container status \"f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30\": rpc error: code = NotFound desc = could not find container \"f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30\": container with ID starting with f50ca2c20a31489667ba49254e84623311d75ef874e62a56baae5b112a4a5f30 not found: ID does not exist" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.730862 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:41 crc kubenswrapper[4618]: E0121 09:20:41.731316 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-log" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.731335 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-log" Jan 21 09:20:41 crc kubenswrapper[4618]: E0121 09:20:41.731351 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-api" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.731357 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-api" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.731559 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" containerName="nova-api-log" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.731583 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" 
containerName="nova-api-api" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.732290 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.732272531 podStartE2EDuration="2.732272531s" podCreationTimestamp="2026-01-21 09:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:41.716030683 +0000 UTC m=+1040.466498001" watchObservedRunningTime="2026-01-21 09:20:41.732272531 +0000 UTC m=+1040.482739847" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.732543 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.742327 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.742553 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.742697 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.746985 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.793038 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.793356 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.793378 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-public-tls-certs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.793400 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgqf\" (UniqueName: \"kubernetes.io/projected/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-kube-api-access-csgqf\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.793456 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-logs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.793493 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-config-data\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.894949 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " 
pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.895073 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.895116 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-public-tls-certs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.895173 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgqf\" (UniqueName: \"kubernetes.io/projected/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-kube-api-access-csgqf\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.895251 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-logs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.895316 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-config-data\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.895880 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-logs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.898350 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.900810 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.900816 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-config-data\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.901352 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-public-tls-certs\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:41 crc kubenswrapper[4618]: I0121 09:20:41.923585 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgqf\" (UniqueName: \"kubernetes.io/projected/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-kube-api-access-csgqf\") pod \"nova-api-0\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " pod="openstack/nova-api-0" Jan 21 09:20:42 crc kubenswrapper[4618]: I0121 09:20:42.168353 4618 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:20:42 crc kubenswrapper[4618]: I0121 09:20:42.600608 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:42 crc kubenswrapper[4618]: I0121 09:20:42.679049 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2ceaf0a-1783-4b11-9f67-a5c8948c589d","Type":"ContainerStarted","Data":"28e7b53a126e777d2fe3fd43f587da437bd24857174c2d55358faa4d405998d6"} Jan 21 09:20:42 crc kubenswrapper[4618]: I0121 09:20:42.680349 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e66513b9-5fbe-4c15-9e35-e35f2c12cd16","Type":"ContainerStarted","Data":"5d39851aebdfe9e4ce53899720b26359df4bc73ab2d7ec3b9d2046a9b06c3526"} Jan 21 09:20:43 crc kubenswrapper[4618]: I0121 09:20:43.552906 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475aa2ca-c0f7-47e5-8ea8-7274ebbc292c" path="/var/lib/kubelet/pods/475aa2ca-c0f7-47e5-8ea8-7274ebbc292c/volumes" Jan 21 09:20:43 crc kubenswrapper[4618]: I0121 09:20:43.690413 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e66513b9-5fbe-4c15-9e35-e35f2c12cd16","Type":"ContainerStarted","Data":"0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3"} Jan 21 09:20:43 crc kubenswrapper[4618]: I0121 09:20:43.690468 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e66513b9-5fbe-4c15-9e35-e35f2c12cd16","Type":"ContainerStarted","Data":"1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b"} Jan 21 09:20:43 crc kubenswrapper[4618]: I0121 09:20:43.692518 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2ceaf0a-1783-4b11-9f67-a5c8948c589d","Type":"ContainerStarted","Data":"3923042191eea171cf1db380ef97ad2ca44b4775668efa796bc38571669b1e49"} Jan 21 09:20:43 crc kubenswrapper[4618]: 
I0121 09:20:43.712718 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.712704419 podStartE2EDuration="2.712704419s" podCreationTimestamp="2026-01-21 09:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:43.705870316 +0000 UTC m=+1042.456337633" watchObservedRunningTime="2026-01-21 09:20:43.712704419 +0000 UTC m=+1042.463171737" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.204359 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.267706 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-wjw8p"] Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.268080 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" containerName="dnsmasq-dns" containerID="cri-o://5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa" gracePeriod=10 Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.317238 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.696695 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.724427 4618 generic.go:334] "Generic (PLEG): container finished" podID="cfcbb487-fc93-4137-99fa-32e268de295a" containerID="5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa" exitCode=0 Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.724485 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" event={"ID":"cfcbb487-fc93-4137-99fa-32e268de295a","Type":"ContainerDied","Data":"5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa"} Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.724517 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" event={"ID":"cfcbb487-fc93-4137-99fa-32e268de295a","Type":"ContainerDied","Data":"8bae8bbfcc8125ad08ad402f697c2b07fb0e8f753f80ebadaf38dab4da0e0aa9"} Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.724537 4618 scope.go:117] "RemoveContainer" containerID="5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.724728 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-wjw8p" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.766812 4618 scope.go:117] "RemoveContainer" containerID="7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.774989 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-svc\") pod \"cfcbb487-fc93-4137-99fa-32e268de295a\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.775160 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-swift-storage-0\") pod \"cfcbb487-fc93-4137-99fa-32e268de295a\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.775201 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pg9m\" (UniqueName: \"kubernetes.io/projected/cfcbb487-fc93-4137-99fa-32e268de295a-kube-api-access-8pg9m\") pod \"cfcbb487-fc93-4137-99fa-32e268de295a\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.775313 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-nb\") pod \"cfcbb487-fc93-4137-99fa-32e268de295a\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.775429 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-sb\") pod \"cfcbb487-fc93-4137-99fa-32e268de295a\" (UID: 
\"cfcbb487-fc93-4137-99fa-32e268de295a\") " Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.775479 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-config\") pod \"cfcbb487-fc93-4137-99fa-32e268de295a\" (UID: \"cfcbb487-fc93-4137-99fa-32e268de295a\") " Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.787899 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcbb487-fc93-4137-99fa-32e268de295a-kube-api-access-8pg9m" (OuterVolumeSpecName: "kube-api-access-8pg9m") pod "cfcbb487-fc93-4137-99fa-32e268de295a" (UID: "cfcbb487-fc93-4137-99fa-32e268de295a"). InnerVolumeSpecName "kube-api-access-8pg9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.798061 4618 scope.go:117] "RemoveContainer" containerID="5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa" Jan 21 09:20:45 crc kubenswrapper[4618]: E0121 09:20:45.798920 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa\": container with ID starting with 5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa not found: ID does not exist" containerID="5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.798958 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa"} err="failed to get container status \"5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa\": rpc error: code = NotFound desc = could not find container \"5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa\": container with ID starting with 
5a127325077e22da60d328fbc4248975d6f1fe7c4584de69d7fe371b263431aa not found: ID does not exist" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.798986 4618 scope.go:117] "RemoveContainer" containerID="7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652" Jan 21 09:20:45 crc kubenswrapper[4618]: E0121 09:20:45.799929 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652\": container with ID starting with 7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652 not found: ID does not exist" containerID="7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.799992 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652"} err="failed to get container status \"7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652\": rpc error: code = NotFound desc = could not find container \"7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652\": container with ID starting with 7e8bc2444dd6fd93d6f2d177c22930cbca1fd10c154dd081aa316905ebeb6652 not found: ID does not exist" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.833843 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfcbb487-fc93-4137-99fa-32e268de295a" (UID: "cfcbb487-fc93-4137-99fa-32e268de295a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.838178 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cfcbb487-fc93-4137-99fa-32e268de295a" (UID: "cfcbb487-fc93-4137-99fa-32e268de295a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.846700 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfcbb487-fc93-4137-99fa-32e268de295a" (UID: "cfcbb487-fc93-4137-99fa-32e268de295a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.847205 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfcbb487-fc93-4137-99fa-32e268de295a" (UID: "cfcbb487-fc93-4137-99fa-32e268de295a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.847530 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-config" (OuterVolumeSpecName: "config") pod "cfcbb487-fc93-4137-99fa-32e268de295a" (UID: "cfcbb487-fc93-4137-99fa-32e268de295a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.878017 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.878049 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pg9m\" (UniqueName: \"kubernetes.io/projected/cfcbb487-fc93-4137-99fa-32e268de295a-kube-api-access-8pg9m\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.878063 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.878076 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.878086 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:45 crc kubenswrapper[4618]: I0121 09:20:45.878096 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfcbb487-fc93-4137-99fa-32e268de295a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:46 crc kubenswrapper[4618]: I0121 09:20:46.053110 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-wjw8p"] Jan 21 09:20:46 crc kubenswrapper[4618]: I0121 09:20:46.059670 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-wjw8p"] Jan 21 
09:20:47 crc kubenswrapper[4618]: I0121 09:20:47.548321 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" path="/var/lib/kubelet/pods/cfcbb487-fc93-4137-99fa-32e268de295a/volumes" Jan 21 09:20:48 crc kubenswrapper[4618]: I0121 09:20:48.770099 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2ceaf0a-1783-4b11-9f67-a5c8948c589d","Type":"ContainerStarted","Data":"028c7fbc1460593d01852399202d916ad5ff589a2fa95ea1e286620ba107084e"} Jan 21 09:20:48 crc kubenswrapper[4618]: I0121 09:20:48.770775 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 09:20:48 crc kubenswrapper[4618]: I0121 09:20:48.791787 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.535400326 podStartE2EDuration="9.791772887s" podCreationTimestamp="2026-01-21 09:20:39 +0000 UTC" firstStartedPulling="2026-01-21 09:20:40.480923434 +0000 UTC m=+1039.231390752" lastFinishedPulling="2026-01-21 09:20:47.737295995 +0000 UTC m=+1046.487763313" observedRunningTime="2026-01-21 09:20:48.78923864 +0000 UTC m=+1047.539705957" watchObservedRunningTime="2026-01-21 09:20:48.791772887 +0000 UTC m=+1047.542240204" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.315343 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.339493 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.811114 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.928980 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-6qr5n"] Jan 21 09:20:50 crc kubenswrapper[4618]: E0121 09:20:50.929401 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" containerName="init" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.929421 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" containerName="init" Jan 21 09:20:50 crc kubenswrapper[4618]: E0121 09:20:50.929443 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" containerName="dnsmasq-dns" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.929450 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" containerName="dnsmasq-dns" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.929683 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcbb487-fc93-4137-99fa-32e268de295a" containerName="dnsmasq-dns" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.930323 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.932867 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.933030 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.946548 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6qr5n"] Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.974378 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.974470 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-config-data\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.974591 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzcv7\" (UniqueName: \"kubernetes.io/projected/cd873f7b-8ec3-44de-85f7-073977049c57-kube-api-access-kzcv7\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:50 crc kubenswrapper[4618]: I0121 09:20:50.974827 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-scripts\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.077843 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzcv7\" (UniqueName: \"kubernetes.io/projected/cd873f7b-8ec3-44de-85f7-073977049c57-kube-api-access-kzcv7\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.078084 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-scripts\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.078312 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.078382 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-config-data\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.085524 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.085813 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-scripts\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.086978 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-config-data\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.093929 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzcv7\" (UniqueName: \"kubernetes.io/projected/cd873f7b-8ec3-44de-85f7-073977049c57-kube-api-access-kzcv7\") pod \"nova-cell1-cell-mapping-6qr5n\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.248973 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.679906 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6qr5n"] Jan 21 09:20:51 crc kubenswrapper[4618]: W0121 09:20:51.680383 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd873f7b_8ec3_44de_85f7_073977049c57.slice/crio-5384e6334fd72674723b83199be16fc647eef43b3283540658f2a8c258d7319d WatchSource:0}: Error finding container 5384e6334fd72674723b83199be16fc647eef43b3283540658f2a8c258d7319d: Status 404 returned error can't find the container with id 5384e6334fd72674723b83199be16fc647eef43b3283540658f2a8c258d7319d Jan 21 09:20:51 crc kubenswrapper[4618]: I0121 09:20:51.808848 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6qr5n" event={"ID":"cd873f7b-8ec3-44de-85f7-073977049c57","Type":"ContainerStarted","Data":"5384e6334fd72674723b83199be16fc647eef43b3283540658f2a8c258d7319d"} Jan 21 09:20:52 crc kubenswrapper[4618]: I0121 09:20:52.168968 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 09:20:52 crc kubenswrapper[4618]: I0121 09:20:52.169036 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 09:20:52 crc kubenswrapper[4618]: I0121 09:20:52.823250 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6qr5n" event={"ID":"cd873f7b-8ec3-44de-85f7-073977049c57","Type":"ContainerStarted","Data":"9166ef3a70452b870b8baf61b10cfd450d3a7ab1e52f13b98739fb31de4d04d7"} Jan 21 09:20:52 crc kubenswrapper[4618]: I0121 09:20:52.846400 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6qr5n" podStartSLOduration=2.846375928 podStartE2EDuration="2.846375928s" 
podCreationTimestamp="2026-01-21 09:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:20:52.8380342 +0000 UTC m=+1051.588501517" watchObservedRunningTime="2026-01-21 09:20:52.846375928 +0000 UTC m=+1051.596843246" Jan 21 09:20:53 crc kubenswrapper[4618]: I0121 09:20:53.179370 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 09:20:53 crc kubenswrapper[4618]: I0121 09:20:53.179371 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 09:20:55 crc kubenswrapper[4618]: I0121 09:20:55.856638 4618 generic.go:334] "Generic (PLEG): container finished" podID="cd873f7b-8ec3-44de-85f7-073977049c57" containerID="9166ef3a70452b870b8baf61b10cfd450d3a7ab1e52f13b98739fb31de4d04d7" exitCode=0 Jan 21 09:20:55 crc kubenswrapper[4618]: I0121 09:20:55.856829 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6qr5n" event={"ID":"cd873f7b-8ec3-44de-85f7-073977049c57","Type":"ContainerDied","Data":"9166ef3a70452b870b8baf61b10cfd450d3a7ab1e52f13b98739fb31de4d04d7"} Jan 21 09:20:56 crc kubenswrapper[4618]: I0121 09:20:56.958801 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:20:56 crc 
kubenswrapper[4618]: I0121 09:20:56.959082 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:20:56 crc kubenswrapper[4618]: I0121 09:20:56.959134 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:20:56 crc kubenswrapper[4618]: I0121 09:20:56.960167 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba02e5d7a9b981ad1d2210de45ad9384cd8e1c52599c2747b664a4d50ae9a210"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:20:56 crc kubenswrapper[4618]: I0121 09:20:56.960232 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://ba02e5d7a9b981ad1d2210de45ad9384cd8e1c52599c2747b664a4d50ae9a210" gracePeriod=600 Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.223337 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.314254 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-config-data\") pod \"cd873f7b-8ec3-44de-85f7-073977049c57\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.315507 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzcv7\" (UniqueName: \"kubernetes.io/projected/cd873f7b-8ec3-44de-85f7-073977049c57-kube-api-access-kzcv7\") pod \"cd873f7b-8ec3-44de-85f7-073977049c57\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.315669 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-combined-ca-bundle\") pod \"cd873f7b-8ec3-44de-85f7-073977049c57\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.315731 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-scripts\") pod \"cd873f7b-8ec3-44de-85f7-073977049c57\" (UID: \"cd873f7b-8ec3-44de-85f7-073977049c57\") " Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.321232 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-scripts" (OuterVolumeSpecName: "scripts") pod "cd873f7b-8ec3-44de-85f7-073977049c57" (UID: "cd873f7b-8ec3-44de-85f7-073977049c57"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.321717 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd873f7b-8ec3-44de-85f7-073977049c57-kube-api-access-kzcv7" (OuterVolumeSpecName: "kube-api-access-kzcv7") pod "cd873f7b-8ec3-44de-85f7-073977049c57" (UID: "cd873f7b-8ec3-44de-85f7-073977049c57"). InnerVolumeSpecName "kube-api-access-kzcv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.342889 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-config-data" (OuterVolumeSpecName: "config-data") pod "cd873f7b-8ec3-44de-85f7-073977049c57" (UID: "cd873f7b-8ec3-44de-85f7-073977049c57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.343870 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd873f7b-8ec3-44de-85f7-073977049c57" (UID: "cd873f7b-8ec3-44de-85f7-073977049c57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.419436 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzcv7\" (UniqueName: \"kubernetes.io/projected/cd873f7b-8ec3-44de-85f7-073977049c57-kube-api-access-kzcv7\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.419471 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.419482 4618 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.419490 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd873f7b-8ec3-44de-85f7-073977049c57-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.879976 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="ba02e5d7a9b981ad1d2210de45ad9384cd8e1c52599c2747b664a4d50ae9a210" exitCode=0 Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.880033 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"ba02e5d7a9b981ad1d2210de45ad9384cd8e1c52599c2747b664a4d50ae9a210"} Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.880366 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" 
event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"1b82cb5509cc8c02fc5f5f4117ccf9ae4b6b90d3ab5a9f956d5d54bd8357ac4b"} Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.880388 4618 scope.go:117] "RemoveContainer" containerID="b58e609790f66ef2752d711bb33506652d1731feac0ae2d67f3b94e098385deb" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.882760 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6qr5n" event={"ID":"cd873f7b-8ec3-44de-85f7-073977049c57","Type":"ContainerDied","Data":"5384e6334fd72674723b83199be16fc647eef43b3283540658f2a8c258d7319d"} Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.882808 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5384e6334fd72674723b83199be16fc647eef43b3283540658f2a8c258d7319d" Jan 21 09:20:57 crc kubenswrapper[4618]: I0121 09:20:57.882836 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6qr5n" Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.071920 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.073027 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-log" containerID="cri-o://1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b" gracePeriod=30 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.073448 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-api" containerID="cri-o://0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3" gracePeriod=30 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.112251 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-scheduler-0"] Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.115191 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" containerName="nova-scheduler-scheduler" containerID="cri-o://4037b30c9805ba652e9878de293c8d226de7f10d90655ddcdc5b4191d469e013" gracePeriod=30 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.139356 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.139603 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-log" containerID="cri-o://2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5" gracePeriod=30 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.139851 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-metadata" containerID="cri-o://ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0" gracePeriod=30 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.900550 4618 generic.go:334] "Generic (PLEG): container finished" podID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerID="1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b" exitCode=143 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.900641 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e66513b9-5fbe-4c15-9e35-e35f2c12cd16","Type":"ContainerDied","Data":"1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b"} Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.903505 4618 generic.go:334] "Generic (PLEG): container finished" podID="0d9730db-82be-44ff-87a8-23912cbdb99b" 
containerID="2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5" exitCode=143 Jan 21 09:20:58 crc kubenswrapper[4618]: I0121 09:20:58.903567 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0d9730db-82be-44ff-87a8-23912cbdb99b","Type":"ContainerDied","Data":"2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5"} Jan 21 09:20:59 crc kubenswrapper[4618]: I0121 09:20:59.927945 4618 generic.go:334] "Generic (PLEG): container finished" podID="6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" containerID="4037b30c9805ba652e9878de293c8d226de7f10d90655ddcdc5b4191d469e013" exitCode=0 Jan 21 09:20:59 crc kubenswrapper[4618]: I0121 09:20:59.928041 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d","Type":"ContainerDied","Data":"4037b30c9805ba652e9878de293c8d226de7f10d90655ddcdc5b4191d469e013"} Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.141179 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.176857 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-combined-ca-bundle\") pod \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.176940 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-config-data\") pod \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.176978 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kccz\" (UniqueName: \"kubernetes.io/projected/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-kube-api-access-2kccz\") pod \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\" (UID: \"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d\") " Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.182161 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-kube-api-access-2kccz" (OuterVolumeSpecName: "kube-api-access-2kccz") pod "6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" (UID: "6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d"). InnerVolumeSpecName "kube-api-access-2kccz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.199608 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" (UID: "6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.200315 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-config-data" (OuterVolumeSpecName: "config-data") pod "6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" (UID: "6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.280207 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.280243 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kccz\" (UniqueName: \"kubernetes.io/projected/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-kube-api-access-2kccz\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.280258 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.942135 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d","Type":"ContainerDied","Data":"f0c188a1521e9f4a29de71c347d3d3f91a9b9a1b278d382abdf2366ae31495b0"} Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.942633 4618 scope.go:117] "RemoveContainer" containerID="4037b30c9805ba652e9878de293c8d226de7f10d90655ddcdc5b4191d469e013" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.942381 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.976460 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.985988 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.994857 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:21:00 crc kubenswrapper[4618]: E0121 09:21:00.995340 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd873f7b-8ec3-44de-85f7-073977049c57" containerName="nova-manage" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.995361 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd873f7b-8ec3-44de-85f7-073977049c57" containerName="nova-manage" Jan 21 09:21:00 crc kubenswrapper[4618]: E0121 09:21:00.995413 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" containerName="nova-scheduler-scheduler" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.995424 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" containerName="nova-scheduler-scheduler" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.995621 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" containerName="nova-scheduler-scheduler" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.995658 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd873f7b-8ec3-44de-85f7-073977049c57" containerName="nova-manage" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.996344 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:21:00 crc kubenswrapper[4618]: I0121 09:21:00.998845 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.000177 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.198459 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17677a46-bed7-4316-91a1-e7d842f83d91-config-data\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.198524 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17677a46-bed7-4316-91a1-e7d842f83d91-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.199245 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9tt5\" (UniqueName: \"kubernetes.io/projected/17677a46-bed7-4316-91a1-e7d842f83d91-kube-api-access-s9tt5\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.265986 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:37378->10.217.0.196:8775: read: connection reset by peer" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.266116 4618 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:37372->10.217.0.196:8775: read: connection reset by peer" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.301434 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9tt5\" (UniqueName: \"kubernetes.io/projected/17677a46-bed7-4316-91a1-e7d842f83d91-kube-api-access-s9tt5\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.301585 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17677a46-bed7-4316-91a1-e7d842f83d91-config-data\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.301644 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17677a46-bed7-4316-91a1-e7d842f83d91-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.307042 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17677a46-bed7-4316-91a1-e7d842f83d91-config-data\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.307190 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17677a46-bed7-4316-91a1-e7d842f83d91-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.317313 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9tt5\" (UniqueName: \"kubernetes.io/projected/17677a46-bed7-4316-91a1-e7d842f83d91-kube-api-access-s9tt5\") pod \"nova-scheduler-0\" (UID: \"17677a46-bed7-4316-91a1-e7d842f83d91\") " pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.553920 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d" path="/var/lib/kubelet/pods/6ac275eb-26fa-4e54-8c8f-e9a0883d8d0d/volumes" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.617429 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.624926 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.690850 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.815620 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-combined-ca-bundle\") pod \"0d9730db-82be-44ff-87a8-23912cbdb99b\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816354 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-config-data\") pod \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816435 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-config-data\") pod \"0d9730db-82be-44ff-87a8-23912cbdb99b\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816503 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-internal-tls-certs\") pod \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816571 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-combined-ca-bundle\") pod \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816618 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-logs\") pod \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816663 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d9730db-82be-44ff-87a8-23912cbdb99b-logs\") pod \"0d9730db-82be-44ff-87a8-23912cbdb99b\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816792 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csgqf\" (UniqueName: \"kubernetes.io/projected/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-kube-api-access-csgqf\") pod \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816824 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52ck7\" (UniqueName: \"kubernetes.io/projected/0d9730db-82be-44ff-87a8-23912cbdb99b-kube-api-access-52ck7\") pod \"0d9730db-82be-44ff-87a8-23912cbdb99b\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.816972 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-nova-metadata-tls-certs\") pod \"0d9730db-82be-44ff-87a8-23912cbdb99b\" (UID: \"0d9730db-82be-44ff-87a8-23912cbdb99b\") " Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.817041 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-public-tls-certs\") pod \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\" (UID: \"e66513b9-5fbe-4c15-9e35-e35f2c12cd16\") " Jan 21 09:21:01 crc 
kubenswrapper[4618]: I0121 09:21:01.817240 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d9730db-82be-44ff-87a8-23912cbdb99b-logs" (OuterVolumeSpecName: "logs") pod "0d9730db-82be-44ff-87a8-23912cbdb99b" (UID: "0d9730db-82be-44ff-87a8-23912cbdb99b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.817377 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-logs" (OuterVolumeSpecName: "logs") pod "e66513b9-5fbe-4c15-9e35-e35f2c12cd16" (UID: "e66513b9-5fbe-4c15-9e35-e35f2c12cd16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.817961 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.818022 4618 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d9730db-82be-44ff-87a8-23912cbdb99b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.821694 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d9730db-82be-44ff-87a8-23912cbdb99b-kube-api-access-52ck7" (OuterVolumeSpecName: "kube-api-access-52ck7") pod "0d9730db-82be-44ff-87a8-23912cbdb99b" (UID: "0d9730db-82be-44ff-87a8-23912cbdb99b"). InnerVolumeSpecName "kube-api-access-52ck7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.824407 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-kube-api-access-csgqf" (OuterVolumeSpecName: "kube-api-access-csgqf") pod "e66513b9-5fbe-4c15-9e35-e35f2c12cd16" (UID: "e66513b9-5fbe-4c15-9e35-e35f2c12cd16"). InnerVolumeSpecName "kube-api-access-csgqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.841028 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-config-data" (OuterVolumeSpecName: "config-data") pod "e66513b9-5fbe-4c15-9e35-e35f2c12cd16" (UID: "e66513b9-5fbe-4c15-9e35-e35f2c12cd16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.846453 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e66513b9-5fbe-4c15-9e35-e35f2c12cd16" (UID: "e66513b9-5fbe-4c15-9e35-e35f2c12cd16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.848281 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-config-data" (OuterVolumeSpecName: "config-data") pod "0d9730db-82be-44ff-87a8-23912cbdb99b" (UID: "0d9730db-82be-44ff-87a8-23912cbdb99b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.849298 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d9730db-82be-44ff-87a8-23912cbdb99b" (UID: "0d9730db-82be-44ff-87a8-23912cbdb99b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.864887 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0d9730db-82be-44ff-87a8-23912cbdb99b" (UID: "0d9730db-82be-44ff-87a8-23912cbdb99b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.867983 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e66513b9-5fbe-4c15-9e35-e35f2c12cd16" (UID: "e66513b9-5fbe-4c15-9e35-e35f2c12cd16"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.874894 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e66513b9-5fbe-4c15-9e35-e35f2c12cd16" (UID: "e66513b9-5fbe-4c15-9e35-e35f2c12cd16"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.926728 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927015 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927064 4618 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927087 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927100 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csgqf\" (UniqueName: \"kubernetes.io/projected/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-kube-api-access-csgqf\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927129 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52ck7\" (UniqueName: \"kubernetes.io/projected/0d9730db-82be-44ff-87a8-23912cbdb99b-kube-api-access-52ck7\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927157 4618 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc 
kubenswrapper[4618]: I0121 09:21:01.927170 4618 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e66513b9-5fbe-4c15-9e35-e35f2c12cd16-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.927181 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d9730db-82be-44ff-87a8-23912cbdb99b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.971568 4618 generic.go:334] "Generic (PLEG): container finished" podID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerID="0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3" exitCode=0 Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.971671 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.971658 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e66513b9-5fbe-4c15-9e35-e35f2c12cd16","Type":"ContainerDied","Data":"0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3"} Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.972133 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e66513b9-5fbe-4c15-9e35-e35f2c12cd16","Type":"ContainerDied","Data":"5d39851aebdfe9e4ce53899720b26359df4bc73ab2d7ec3b9d2046a9b06c3526"} Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.972204 4618 scope.go:117] "RemoveContainer" containerID="0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3" Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.978799 4618 generic.go:334] "Generic (PLEG): container finished" podID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerID="ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0" exitCode=0 Jan 21 09:21:01 crc 
kubenswrapper[4618]: I0121 09:21:01.978862 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0d9730db-82be-44ff-87a8-23912cbdb99b","Type":"ContainerDied","Data":"ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0"} Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.978903 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0d9730db-82be-44ff-87a8-23912cbdb99b","Type":"ContainerDied","Data":"476b78edff90f65dc9dc18ef0e530666342775360ec287eb67a2caefd74407f4"} Jan 21 09:21:01 crc kubenswrapper[4618]: I0121 09:21:01.978994 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.014012 4618 scope.go:117] "RemoveContainer" containerID="1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.022728 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.032432 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.042395 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.043260 4618 scope.go:117] "RemoveContainer" containerID="0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3" Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.047932 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3\": container with ID starting with 0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3 not found: ID does not exist" 
containerID="0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.047972 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3"} err="failed to get container status \"0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3\": rpc error: code = NotFound desc = could not find container \"0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3\": container with ID starting with 0c177892193558d63a26deced2108d3c25659c7e713666165df3fe6360e6cdc3 not found: ID does not exist" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048001 4618 scope.go:117] "RemoveContainer" containerID="1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048117 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.048550 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-log" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048568 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-log" Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.048586 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-api" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048592 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-api" Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.048610 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" 
containerName="nova-metadata-metadata" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048615 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-metadata" Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.048631 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-log" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048636 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-log" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048847 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-metadata" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048873 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-log" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048886 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" containerName="nova-api-api" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.048900 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" containerName="nova-metadata-log" Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.049068 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b\": container with ID starting with 1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b not found: ID does not exist" containerID="1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.049177 4618 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b"} err="failed to get container status \"1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b\": rpc error: code = NotFound desc = could not find container \"1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b\": container with ID starting with 1f4da59ac82e2218cd0f5520978957cd99f1cba47c9b15a669c7e582b6018e3b not found: ID does not exist" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.049269 4618 scope.go:117] "RemoveContainer" containerID="ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.049933 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.052501 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.055661 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.057856 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.094156 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.102421 4618 scope.go:117] "RemoveContainer" containerID="2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.160205 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.161941 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.165537 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.165741 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.165916 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.183282 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.221710 4618 scope.go:117] "RemoveContainer" containerID="ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.225217 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.232358 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0\": container with ID starting with ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0 not found: ID does not exist" containerID="ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.232417 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0"} err="failed to get container status \"ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0\": rpc error: code = NotFound desc = could not find container \"ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0\": container with ID starting with 
ee6cd77ab5924a74095829597a7542096d06f66e9f7d013f110f093a97f326d0 not found: ID does not exist" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.232449 4618 scope.go:117] "RemoveContainer" containerID="2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5" Jan 21 09:21:02 crc kubenswrapper[4618]: E0121 09:21:02.236244 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5\": container with ID starting with 2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5 not found: ID does not exist" containerID="2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.236281 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5"} err="failed to get container status \"2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5\": rpc error: code = NotFound desc = could not find container \"2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5\": container with ID starting with 2967998dcd26a6835b5aef70f065da4cb925e7d49cad858edae9a9ebfdcf48f5 not found: ID does not exist" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238419 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238465 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7g2\" (UniqueName: \"kubernetes.io/projected/9edd45d0-acce-46fa-b1d2-29ddc021d690-kube-api-access-hq7g2\") 
pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238511 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-config-data\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238545 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-config-data\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238631 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238684 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4j7c\" (UniqueName: \"kubernetes.io/projected/bcd77836-0c95-4165-8e69-9f1851be8f50-kube-api-access-g4j7c\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238737 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " 
pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238804 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238826 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd45d0-acce-46fa-b1d2-29ddc021d690-logs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238890 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd77836-0c95-4165-8e69-9f1851be8f50-logs\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.238963 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-public-tls-certs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.340771 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-config-data\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341251 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-config-data\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341450 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4j7c\" (UniqueName: \"kubernetes.io/projected/bcd77836-0c95-4165-8e69-9f1851be8f50-kube-api-access-g4j7c\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341494 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341544 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341570 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd45d0-acce-46fa-b1d2-29ddc021d690-logs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " 
pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341625 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd77836-0c95-4165-8e69-9f1851be8f50-logs\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341689 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-public-tls-certs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341779 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.341810 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7g2\" (UniqueName: \"kubernetes.io/projected/9edd45d0-acce-46fa-b1d2-29ddc021d690-kube-api-access-hq7g2\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.342821 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd45d0-acce-46fa-b1d2-29ddc021d690-logs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.343269 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bcd77836-0c95-4165-8e69-9f1851be8f50-logs\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.347392 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-public-tls-certs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.347500 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-config-data\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.348929 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.349096 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.349638 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd45d0-acce-46fa-b1d2-29ddc021d690-config-data\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.351766 4618 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.352167 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcd77836-0c95-4165-8e69-9f1851be8f50-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.360684 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7g2\" (UniqueName: \"kubernetes.io/projected/9edd45d0-acce-46fa-b1d2-29ddc021d690-kube-api-access-hq7g2\") pod \"nova-api-0\" (UID: \"9edd45d0-acce-46fa-b1d2-29ddc021d690\") " pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.363364 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4j7c\" (UniqueName: \"kubernetes.io/projected/bcd77836-0c95-4165-8e69-9f1851be8f50-kube-api-access-g4j7c\") pod \"nova-metadata-0\" (UID: \"bcd77836-0c95-4165-8e69-9f1851be8f50\") " pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.371115 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.516328 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.768087 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.911923 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.997479 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcd77836-0c95-4165-8e69-9f1851be8f50","Type":"ContainerStarted","Data":"488e69090651e11a4ec3af3af30073ff31e6f30f2b40397cebaf44ce6edb7483"} Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.997530 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcd77836-0c95-4165-8e69-9f1851be8f50","Type":"ContainerStarted","Data":"77090c67209ce9fb5525ed482340c5d8e1506095ec4201be3964f71ead33d9ae"} Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.999933 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17677a46-bed7-4316-91a1-e7d842f83d91","Type":"ContainerStarted","Data":"ee13e4bc706e7e94f92ea6079a5bf4594f5a7e23b459637b78cc4732ae12f12d"} Jan 21 09:21:02 crc kubenswrapper[4618]: I0121 09:21:02.999972 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17677a46-bed7-4316-91a1-e7d842f83d91","Type":"ContainerStarted","Data":"5ed12aca98a6111d7399fd2d21ffb6a805e7eee764aa0dab91bc01947f9f3fe7"} Jan 21 09:21:03 crc kubenswrapper[4618]: I0121 09:21:03.001242 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd45d0-acce-46fa-b1d2-29ddc021d690","Type":"ContainerStarted","Data":"839323cdec7d9c4453402195c1e52a1d1c14046ae2d33c12edf6d36c52bee20a"} Jan 21 09:21:03 crc kubenswrapper[4618]: I0121 09:21:03.025942 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=3.025932047 podStartE2EDuration="3.025932047s" podCreationTimestamp="2026-01-21 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:21:03.018711878 +0000 UTC m=+1061.769179185" watchObservedRunningTime="2026-01-21 09:21:03.025932047 +0000 UTC m=+1061.776399355" Jan 21 09:21:03 crc kubenswrapper[4618]: I0121 09:21:03.550938 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d9730db-82be-44ff-87a8-23912cbdb99b" path="/var/lib/kubelet/pods/0d9730db-82be-44ff-87a8-23912cbdb99b/volumes" Jan 21 09:21:03 crc kubenswrapper[4618]: I0121 09:21:03.552036 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66513b9-5fbe-4c15-9e35-e35f2c12cd16" path="/var/lib/kubelet/pods/e66513b9-5fbe-4c15-9e35-e35f2c12cd16/volumes" Jan 21 09:21:04 crc kubenswrapper[4618]: I0121 09:21:04.017679 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd45d0-acce-46fa-b1d2-29ddc021d690","Type":"ContainerStarted","Data":"b34798487d4b54e96c62a90be9750948d798000c24d656b5acc52dace075f375"} Jan 21 09:21:04 crc kubenswrapper[4618]: I0121 09:21:04.018063 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd45d0-acce-46fa-b1d2-29ddc021d690","Type":"ContainerStarted","Data":"b4a4318b74f3aa875fced4596f5b259cb56b9f6f0ace1e9d5bb66f67b9dcbf7c"} Jan 21 09:21:04 crc kubenswrapper[4618]: I0121 09:21:04.024165 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcd77836-0c95-4165-8e69-9f1851be8f50","Type":"ContainerStarted","Data":"649c874e6fc5d4ac2ac0e049fe3095089b54e6d33ba0541278c9b95a951e4dae"} Jan 21 09:21:04 crc kubenswrapper[4618]: I0121 09:21:04.044883 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.044865469 podStartE2EDuration="2.044865469s" podCreationTimestamp="2026-01-21 09:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:21:04.039108562 +0000 UTC m=+1062.789575879" watchObservedRunningTime="2026-01-21 09:21:04.044865469 +0000 UTC m=+1062.795332786" Jan 21 09:21:04 crc kubenswrapper[4618]: I0121 09:21:04.061045 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.061028878 podStartE2EDuration="2.061028878s" podCreationTimestamp="2026-01-21 09:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:21:04.054438304 +0000 UTC m=+1062.804905620" watchObservedRunningTime="2026-01-21 09:21:04.061028878 +0000 UTC m=+1062.811496196" Jan 21 09:21:06 crc kubenswrapper[4618]: I0121 09:21:06.617746 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 09:21:07 crc kubenswrapper[4618]: I0121 09:21:07.371850 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 09:21:07 crc kubenswrapper[4618]: I0121 09:21:07.371929 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 09:21:10 crc kubenswrapper[4618]: I0121 09:21:10.052331 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 09:21:11 crc kubenswrapper[4618]: I0121 09:21:11.618089 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 09:21:11 crc kubenswrapper[4618]: I0121 09:21:11.642053 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 09:21:12 crc kubenswrapper[4618]: I0121 
09:21:12.134747 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 09:21:12 crc kubenswrapper[4618]: I0121 09:21:12.372709 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 09:21:12 crc kubenswrapper[4618]: I0121 09:21:12.373136 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 09:21:12 crc kubenswrapper[4618]: I0121 09:21:12.516857 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 09:21:12 crc kubenswrapper[4618]: I0121 09:21:12.516921 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 09:21:13 crc kubenswrapper[4618]: I0121 09:21:13.389302 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bcd77836-0c95-4165-8e69-9f1851be8f50" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 09:21:13 crc kubenswrapper[4618]: I0121 09:21:13.389299 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bcd77836-0c95-4165-8e69-9f1851be8f50" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 09:21:13 crc kubenswrapper[4618]: I0121 09:21:13.524322 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9edd45d0-acce-46fa-b1d2-29ddc021d690" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 09:21:13 crc kubenswrapper[4618]: I0121 09:21:13.529310 4618 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9edd45d0-acce-46fa-b1d2-29ddc021d690" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.378508 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.380617 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.384559 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.523109 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.523868 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.523998 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 09:21:22 crc kubenswrapper[4618]: I0121 09:21:22.531244 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 09:21:23 crc kubenswrapper[4618]: I0121 09:21:23.213944 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 09:21:23 crc kubenswrapper[4618]: I0121 09:21:23.219579 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 09:21:23 crc kubenswrapper[4618]: I0121 09:21:23.221606 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 09:21:29 crc kubenswrapper[4618]: I0121 09:21:29.780574 4618 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:21:30 crc kubenswrapper[4618]: I0121 09:21:30.506280 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:21:33 crc kubenswrapper[4618]: I0121 09:21:33.787085 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="rabbitmq" containerID="cri-o://51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec" gracePeriod=604796 Jan 21 09:21:34 crc kubenswrapper[4618]: I0121 09:21:34.421217 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="rabbitmq" containerID="cri-o://7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457" gracePeriod=604797 Jan 21 09:21:38 crc kubenswrapper[4618]: I0121 09:21:38.890044 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Jan 21 09:21:39 crc kubenswrapper[4618]: I0121 09:21:39.137354 4618 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.318558 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.392849 4618 generic.go:334] "Generic (PLEG): container finished" podID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerID="51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec" exitCode=0 Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.392887 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0a9f652e-69d5-4c54-a3e8-9d926313e47d","Type":"ContainerDied","Data":"51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec"} Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.392915 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0a9f652e-69d5-4c54-a3e8-9d926313e47d","Type":"ContainerDied","Data":"fe063e0551b772449512b0ec7d14ee2c33967b5fb57352f09f3c87f6f8b1e2ee"} Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.392931 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.392939 4618 scope.go:117] "RemoveContainer" containerID="51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.420205 4618 scope.go:117] "RemoveContainer" containerID="b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.442195 4618 scope.go:117] "RemoveContainer" containerID="51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec" Jan 21 09:21:40 crc kubenswrapper[4618]: E0121 09:21:40.442565 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec\": container with ID starting with 51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec not found: ID does not exist" containerID="51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.442626 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec"} err="failed to get container status \"51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec\": rpc error: code = NotFound desc = could not find container \"51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec\": container with ID starting with 51d365f24a97bfbda4d0d9faae22c28df0538a7e37fe3db215094351b573eaec not found: ID does not exist" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.442659 4618 scope.go:117] "RemoveContainer" containerID="b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8" Jan 21 09:21:40 crc kubenswrapper[4618]: E0121 09:21:40.442962 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8\": container with ID starting with b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8 not found: ID does not exist" containerID="b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.442999 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8"} err="failed to get container status \"b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8\": rpc error: code = NotFound desc = could not find container \"b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8\": container with ID starting with b5aff7ff83d6880308dd94176da84c03afbb8b269fac4dbc397c1a07a73cb3b8 not found: ID does not exist" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.463875 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.463925 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a9f652e-69d5-4c54-a3e8-9d926313e47d-erlang-cookie-secret\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.463999 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-plugins-conf\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.464035 4618 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-plugins\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.464096 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-tls\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.464175 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-server-conf\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.464710 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.464874 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465138 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-config-data\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465195 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-confd\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465241 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a9f652e-69d5-4c54-a3e8-9d926313e47d-pod-info\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465270 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxtg\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-kube-api-access-4dxtg\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465287 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-erlang-cookie\") pod \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\" (UID: \"0a9f652e-69d5-4c54-a3e8-9d926313e47d\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465763 4618 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.465783 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.466761 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.471349 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.471511 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0a9f652e-69d5-4c54-a3e8-9d926313e47d-pod-info" (OuterVolumeSpecName: "pod-info") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.472015 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.477687 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9f652e-69d5-4c54-a3e8-9d926313e47d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.482933 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-kube-api-access-4dxtg" (OuterVolumeSpecName: "kube-api-access-4dxtg") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "kube-api-access-4dxtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.506617 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-server-conf" (OuterVolumeSpecName: "server-conf") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.519805 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-config-data" (OuterVolumeSpecName: "config-data") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.555506 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0a9f652e-69d5-4c54-a3e8-9d926313e47d" (UID: "0a9f652e-69d5-4c54-a3e8-9d926313e47d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568346 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568387 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568399 4618 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a9f652e-69d5-4c54-a3e8-9d926313e47d-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568409 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxtg\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-kube-api-access-4dxtg\") on node \"crc\" DevicePath 
\"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568419 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568458 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568467 4618 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a9f652e-69d5-4c54-a3e8-9d926313e47d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568479 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a9f652e-69d5-4c54-a3e8-9d926313e47d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.568489 4618 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a9f652e-69d5-4c54-a3e8-9d926313e47d-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.588880 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.675441 4618 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.734404 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:21:40 crc 
kubenswrapper[4618]: I0121 09:21:40.741710 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.756881 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:21:40 crc kubenswrapper[4618]: E0121 09:21:40.757273 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="rabbitmq" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.757290 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="rabbitmq" Jan 21 09:21:40 crc kubenswrapper[4618]: E0121 09:21:40.757329 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="setup-container" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.757335 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="setup-container" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.757508 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" containerName="rabbitmq" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.762376 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.764511 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.764754 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.764861 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.765315 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-677wz" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.765511 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.765623 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.765728 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.823513 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.835076 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.888870 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889065 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a257ccd-7e16-4450-810b-14a2dca56eab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889435 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a257ccd-7e16-4450-810b-14a2dca56eab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889501 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889584 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-config-data\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 
09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889652 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889699 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889722 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889759 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhbv\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-kube-api-access-rbhbv\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889781 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.889809 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991008 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d8d0c9b-9097-462d-904e-7ff5126b1056-erlang-cookie-secret\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991086 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-server-conf\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991168 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d8d0c9b-9097-462d-904e-7ff5126b1056-pod-info\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991204 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-tls\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991243 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7wv7\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-kube-api-access-w7wv7\") pod 
\"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991270 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-erlang-cookie\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991373 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-config-data\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991393 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-plugins\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991420 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991564 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-confd\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.991627 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-plugins-conf\") pod \"1d8d0c9b-9097-462d-904e-7ff5126b1056\" (UID: \"1d8d0c9b-9097-462d-904e-7ff5126b1056\") " Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.992072 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a257ccd-7e16-4450-810b-14a2dca56eab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.992112 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.992196 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-config-data\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.992282 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.992345 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.992370 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.993479 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhbv\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-kube-api-access-rbhbv\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.993534 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.993556 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.993624 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.993656 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a257ccd-7e16-4450-810b-14a2dca56eab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.993992 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-config-data\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.999012 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1d8d0c9b-9097-462d-904e-7ff5126b1056-pod-info" (OuterVolumeSpecName: "pod-info") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 09:21:40 crc kubenswrapper[4618]: I0121 09:21:40.999647 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a257ccd-7e16-4450-810b-14a2dca56eab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.000023 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.006663 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8d0c9b-9097-462d-904e-7ff5126b1056-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.006944 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.007248 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.007812 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.008623 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 
crc kubenswrapper[4618]: I0121 09:21:41.009720 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a257ccd-7e16-4450-810b-14a2dca56eab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.014743 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.014783 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.015934 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a257ccd-7e16-4450-810b-14a2dca56eab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.017001 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.019606 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-kube-api-access-w7wv7" (OuterVolumeSpecName: "kube-api-access-w7wv7") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "kube-api-access-w7wv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.020700 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.024497 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.039588 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.043913 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhbv\" (UniqueName: \"kubernetes.io/projected/1a257ccd-7e16-4450-810b-14a2dca56eab-kube-api-access-rbhbv\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.055614 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-config-data" (OuterVolumeSpecName: "config-data") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.071357 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1a257ccd-7e16-4450-810b-14a2dca56eab\") " pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.095454 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8595b94875-vlz4p"] Jan 21 09:21:41 crc kubenswrapper[4618]: E0121 09:21:41.095924 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="rabbitmq" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.095937 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="rabbitmq" Jan 21 09:21:41 crc kubenswrapper[4618]: E0121 09:21:41.095953 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="setup-container" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 
09:21:41.095959 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="setup-container" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.096135 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" containerName="rabbitmq" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.098369 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.098978 4618 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099004 4618 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d8d0c9b-9097-462d-904e-7ff5126b1056-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099014 4618 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d8d0c9b-9097-462d-904e-7ff5126b1056-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099023 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099034 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7wv7\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-kube-api-access-w7wv7\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099045 4618 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099055 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099063 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.099084 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.103565 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.105961 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.112487 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8595b94875-vlz4p"] Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.112661 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-server-conf" (OuterVolumeSpecName: "server-conf") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.125317 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.191246 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1d8d0c9b-9097-462d-904e-7ff5126b1056" (UID: "1d8d0c9b-9097-462d-904e-7ff5126b1056"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200374 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-nb\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200496 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-config\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200559 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-openstack-edpm-ipam\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 
09:21:41.200613 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-sb\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200638 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tj5g\" (UniqueName: \"kubernetes.io/projected/2e5171af-4752-4883-88ee-a9be2a6affbb-kube-api-access-8tj5g\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200695 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-svc\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200726 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-swift-storage-0\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200901 4618 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d8d0c9b-9097-462d-904e-7ff5126b1056-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200917 4618 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.200928 4618 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d8d0c9b-9097-462d-904e-7ff5126b1056-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.302838 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-config\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303072 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-openstack-edpm-ipam\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303175 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-sb\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303201 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tj5g\" (UniqueName: \"kubernetes.io/projected/2e5171af-4752-4883-88ee-a9be2a6affbb-kube-api-access-8tj5g\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: 
I0121 09:21:41.303264 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-svc\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303294 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-swift-storage-0\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303495 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-nb\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303925 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-config\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.303978 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-openstack-edpm-ipam\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.304678 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-swift-storage-0\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.304754 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-nb\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.304697 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-svc\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.305386 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-sb\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.323604 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tj5g\" (UniqueName: \"kubernetes.io/projected/2e5171af-4752-4883-88ee-a9be2a6affbb-kube-api-access-8tj5g\") pod \"dnsmasq-dns-8595b94875-vlz4p\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.405488 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d8d0c9b-9097-462d-904e-7ff5126b1056" 
containerID="7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457" exitCode=0 Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.405561 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d8d0c9b-9097-462d-904e-7ff5126b1056","Type":"ContainerDied","Data":"7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457"} Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.405614 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.405644 4618 scope.go:117] "RemoveContainer" containerID="7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.405627 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d8d0c9b-9097-462d-904e-7ff5126b1056","Type":"ContainerDied","Data":"a220c0767364c77f9016a122d039a01b5ea44cbef776b971e42c437447a464e5"} Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.443642 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.447576 4618 scope.go:117] "RemoveContainer" containerID="84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.449181 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.472871 4618 scope.go:117] "RemoveContainer" containerID="7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457" Jan 21 09:21:41 crc kubenswrapper[4618]: E0121 09:21:41.473291 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457\": container with ID starting with 7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457 not found: ID does not exist" containerID="7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.473328 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457"} err="failed to get container status \"7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457\": rpc error: code = NotFound desc = could not find container \"7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457\": container with ID starting with 7d21b614e2ffdac341d6f4ae4ab5e51428834e6ce951d6b3aac5fd52e21e8457 not found: ID does not exist" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.473351 4618 scope.go:117] "RemoveContainer" containerID="84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea" Jan 21 09:21:41 crc kubenswrapper[4618]: E0121 09:21:41.473693 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea\": container with ID starting with 84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea not found: ID does not exist" containerID="84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.473736 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea"} err="failed to get container status \"84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea\": rpc error: code = NotFound desc = could not find container \"84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea\": container with ID 
starting with 84e338c84f18d1167e4bc7d5cbcbbb0fd139711c8b1acdfef6d01d4daf51ebea not found: ID does not exist" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.475232 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.476989 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.479126 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.479281 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.480582 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.480757 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2sgc6" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.480881 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.481010 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.481112 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.487265 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.536766 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.549367 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9f652e-69d5-4c54-a3e8-9d926313e47d" path="/var/lib/kubelet/pods/0a9f652e-69d5-4c54-a3e8-9d926313e47d/volumes" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.550200 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d8d0c9b-9097-462d-904e-7ff5126b1056" path="/var/lib/kubelet/pods/1d8d0c9b-9097-462d-904e-7ff5126b1056/volumes" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.554172 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610279 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610385 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610413 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610433 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6594f517-1fec-47c9-909d-674c8a7f36dd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610559 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6594f517-1fec-47c9-909d-674c8a7f36dd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610590 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610614 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610639 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610660 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck8br\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-kube-api-access-ck8br\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610696 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.610720 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713024 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713115 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713137 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713169 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6594f517-1fec-47c9-909d-674c8a7f36dd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713229 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6594f517-1fec-47c9-909d-674c8a7f36dd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713254 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713275 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713295 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713314 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck8br\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-kube-api-access-ck8br\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713336 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.713351 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.714362 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.714471 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.714680 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.715788 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.716383 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.716429 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6594f517-1fec-47c9-909d-674c8a7f36dd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.717837 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6594f517-1fec-47c9-909d-674c8a7f36dd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.717984 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.718391 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.720898 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6594f517-1fec-47c9-909d-674c8a7f36dd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.727648 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck8br\" (UniqueName: \"kubernetes.io/projected/6594f517-1fec-47c9-909d-674c8a7f36dd-kube-api-access-ck8br\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.752474 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6594f517-1fec-47c9-909d-674c8a7f36dd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.801196 4618 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:21:41 crc kubenswrapper[4618]: I0121 09:21:41.964673 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8595b94875-vlz4p"] Jan 21 09:21:42 crc kubenswrapper[4618]: I0121 09:21:42.204394 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 09:21:42 crc kubenswrapper[4618]: I0121 09:21:42.432029 4618 generic.go:334] "Generic (PLEG): container finished" podID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerID="e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499" exitCode=0 Jan 21 09:21:42 crc kubenswrapper[4618]: I0121 09:21:42.432213 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" event={"ID":"2e5171af-4752-4883-88ee-a9be2a6affbb","Type":"ContainerDied","Data":"e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499"} Jan 21 09:21:42 crc kubenswrapper[4618]: I0121 09:21:42.432275 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" event={"ID":"2e5171af-4752-4883-88ee-a9be2a6affbb","Type":"ContainerStarted","Data":"ffa7e7c500c7904a59ae7df362d037f32df922c6f733aae5a6cac0635d2c935a"} Jan 21 09:21:42 crc kubenswrapper[4618]: I0121 09:21:42.438241 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6594f517-1fec-47c9-909d-674c8a7f36dd","Type":"ContainerStarted","Data":"e25c32cbbf362fb2d758d50096eec7df08a49c885dcf0a83cd12648eec31241c"} Jan 21 09:21:42 crc kubenswrapper[4618]: I0121 09:21:42.440760 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a257ccd-7e16-4450-810b-14a2dca56eab","Type":"ContainerStarted","Data":"2b5759da8696614e357cd47e9bb5c1c4242e7ca03e9f42d8a9bef3ba3e9359b1"} Jan 21 09:21:43 crc kubenswrapper[4618]: I0121 09:21:43.454787 4618 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" event={"ID":"2e5171af-4752-4883-88ee-a9be2a6affbb","Type":"ContainerStarted","Data":"539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543"} Jan 21 09:21:43 crc kubenswrapper[4618]: I0121 09:21:43.455257 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:43 crc kubenswrapper[4618]: I0121 09:21:43.457291 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a257ccd-7e16-4450-810b-14a2dca56eab","Type":"ContainerStarted","Data":"4d1977ef800dab59cfab2878e3fabb0fede733a4c04a275619567a1c58429040"} Jan 21 09:21:43 crc kubenswrapper[4618]: I0121 09:21:43.481449 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" podStartSLOduration=2.481420894 podStartE2EDuration="2.481420894s" podCreationTimestamp="2026-01-21 09:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:21:43.474559809 +0000 UTC m=+1102.225027126" watchObservedRunningTime="2026-01-21 09:21:43.481420894 +0000 UTC m=+1102.231888211" Jan 21 09:21:44 crc kubenswrapper[4618]: I0121 09:21:44.473472 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6594f517-1fec-47c9-909d-674c8a7f36dd","Type":"ContainerStarted","Data":"fdb5dfe6b334afbb03bd9b3c0b2249040f90eafbfac5d8d55cce10a1728ec13a"} Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.548557 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.614229 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-4cztp"] Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.614478 
4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerName="dnsmasq-dns" containerID="cri-o://271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464" gracePeriod=10 Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.732750 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d7b79b84c-h9xrw"] Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.734418 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.752063 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d7b79b84c-h9xrw"] Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.850525 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bdlr\" (UniqueName: \"kubernetes.io/projected/9f529ba0-9024-4b63-8d19-bb798710ce6f-kube-api-access-6bdlr\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.850922 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-dns-swift-storage-0\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.850994 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-config\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " 
pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.851091 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.851195 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-ovsdbserver-sb\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.851428 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-dns-svc\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.851494 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-ovsdbserver-nb\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.952939 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-dns-svc\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " 
pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953006 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-ovsdbserver-nb\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953219 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bdlr\" (UniqueName: \"kubernetes.io/projected/9f529ba0-9024-4b63-8d19-bb798710ce6f-kube-api-access-6bdlr\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953323 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-dns-swift-storage-0\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953345 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-config\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953381 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" 
Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953431 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-ovsdbserver-sb\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.953924 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-dns-svc\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.954181 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-dns-swift-storage-0\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.954208 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-config\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.956040 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-ovsdbserver-nb\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.957325 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.957396 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f529ba0-9024-4b63-8d19-bb798710ce6f-ovsdbserver-sb\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:51 crc kubenswrapper[4618]: I0121 09:21:51.975707 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bdlr\" (UniqueName: \"kubernetes.io/projected/9f529ba0-9024-4b63-8d19-bb798710ce6f-kube-api-access-6bdlr\") pod \"dnsmasq-dns-d7b79b84c-h9xrw\" (UID: \"9f529ba0-9024-4b63-8d19-bb798710ce6f\") " pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.065730 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.085089 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.156586 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz84c\" (UniqueName: \"kubernetes.io/projected/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-kube-api-access-rz84c\") pod \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.156638 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-config\") pod \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.156660 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-svc\") pod \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.156683 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-swift-storage-0\") pod \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.156772 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-nb\") pod \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.156830 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-sb\") pod \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\" (UID: \"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d\") " Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.162570 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-kube-api-access-rz84c" (OuterVolumeSpecName: "kube-api-access-rz84c") pod "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" (UID: "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d"). InnerVolumeSpecName "kube-api-access-rz84c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.200115 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-config" (OuterVolumeSpecName: "config") pod "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" (UID: "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.206095 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" (UID: "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.210241 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" (UID: "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.213866 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" (UID: "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.217702 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" (UID: "bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.265425 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz84c\" (UniqueName: \"kubernetes.io/projected/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-kube-api-access-rz84c\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.265450 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.265462 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.265473 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 
09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.265483 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.265493 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.476948 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d7b79b84c-h9xrw"] Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.559647 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" event={"ID":"9f529ba0-9024-4b63-8d19-bb798710ce6f","Type":"ContainerStarted","Data":"124fa400cff802f5272abe98150797e5af9a78fc725bef54af701bb9cd5932c5"} Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.562605 4618 generic.go:334] "Generic (PLEG): container finished" podID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerID="271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464" exitCode=0 Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.562677 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" event={"ID":"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d","Type":"ContainerDied","Data":"271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464"} Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.562733 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.562753 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-4cztp" event={"ID":"bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d","Type":"ContainerDied","Data":"402b9bdd57418b4292b1002d8013dfa3410cf0edbbb091b4ea1ae977dd81a92f"} Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.562785 4618 scope.go:117] "RemoveContainer" containerID="271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.608098 4618 scope.go:117] "RemoveContainer" containerID="3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.612064 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-4cztp"] Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.622837 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-4cztp"] Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.635591 4618 scope.go:117] "RemoveContainer" containerID="271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464" Jan 21 09:21:52 crc kubenswrapper[4618]: E0121 09:21:52.636097 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464\": container with ID starting with 271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464 not found: ID does not exist" containerID="271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.636130 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464"} err="failed to get container status 
\"271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464\": rpc error: code = NotFound desc = could not find container \"271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464\": container with ID starting with 271f19052cc56c71e028fe068673e57b9f21065ad5e4d373ae4bff27e5287464 not found: ID does not exist" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.636170 4618 scope.go:117] "RemoveContainer" containerID="3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d" Jan 21 09:21:52 crc kubenswrapper[4618]: E0121 09:21:52.636490 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d\": container with ID starting with 3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d not found: ID does not exist" containerID="3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d" Jan 21 09:21:52 crc kubenswrapper[4618]: I0121 09:21:52.636509 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d"} err="failed to get container status \"3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d\": rpc error: code = NotFound desc = could not find container \"3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d\": container with ID starting with 3439780ad22229c1c4c09b82ed8272951367e43efd361d0a6800db81605c5b0d not found: ID does not exist" Jan 21 09:21:53 crc kubenswrapper[4618]: I0121 09:21:53.548853 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" path="/var/lib/kubelet/pods/bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d/volumes" Jan 21 09:21:53 crc kubenswrapper[4618]: I0121 09:21:53.574952 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f529ba0-9024-4b63-8d19-bb798710ce6f" 
containerID="06ea3d3c16138b90e7cb8b08aaac6f3eabdb5f49a6ceb9575793ee46154f4ac4" exitCode=0 Jan 21 09:21:53 crc kubenswrapper[4618]: I0121 09:21:53.575002 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" event={"ID":"9f529ba0-9024-4b63-8d19-bb798710ce6f","Type":"ContainerDied","Data":"06ea3d3c16138b90e7cb8b08aaac6f3eabdb5f49a6ceb9575793ee46154f4ac4"} Jan 21 09:21:54 crc kubenswrapper[4618]: I0121 09:21:54.586192 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" event={"ID":"9f529ba0-9024-4b63-8d19-bb798710ce6f","Type":"ContainerStarted","Data":"8a2b138c9662ac0929546ffc86e0a190bdd0aaaf4b3bf673ada95a5f074af536"} Jan 21 09:21:54 crc kubenswrapper[4618]: I0121 09:21:54.587454 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:21:54 crc kubenswrapper[4618]: I0121 09:21:54.607235 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" podStartSLOduration=3.607219449 podStartE2EDuration="3.607219449s" podCreationTimestamp="2026-01-21 09:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:21:54.603569484 +0000 UTC m=+1113.354036791" watchObservedRunningTime="2026-01-21 09:21:54.607219449 +0000 UTC m=+1113.357686767" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.087205 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d7b79b84c-h9xrw" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.163672 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8595b94875-vlz4p"] Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.164344 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" 
podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerName="dnsmasq-dns" containerID="cri-o://539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543" gracePeriod=10 Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.580002 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.681118 4618 generic.go:334] "Generic (PLEG): container finished" podID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerID="539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543" exitCode=0 Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.681212 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.681314 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" event={"ID":"2e5171af-4752-4883-88ee-a9be2a6affbb","Type":"ContainerDied","Data":"539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543"} Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.681406 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8595b94875-vlz4p" event={"ID":"2e5171af-4752-4883-88ee-a9be2a6affbb","Type":"ContainerDied","Data":"ffa7e7c500c7904a59ae7df362d037f32df922c6f733aae5a6cac0635d2c935a"} Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.681470 4618 scope.go:117] "RemoveContainer" containerID="539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.701891 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-openstack-edpm-ipam\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc 
kubenswrapper[4618]: I0121 09:22:02.702222 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-nb\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.702349 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-svc\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.702401 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-config\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.702499 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-sb\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.702535 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-swift-storage-0\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.702612 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tj5g\" (UniqueName: 
\"kubernetes.io/projected/2e5171af-4752-4883-88ee-a9be2a6affbb-kube-api-access-8tj5g\") pod \"2e5171af-4752-4883-88ee-a9be2a6affbb\" (UID: \"2e5171af-4752-4883-88ee-a9be2a6affbb\") " Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.706374 4618 scope.go:117] "RemoveContainer" containerID="e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.724855 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5171af-4752-4883-88ee-a9be2a6affbb-kube-api-access-8tj5g" (OuterVolumeSpecName: "kube-api-access-8tj5g") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "kube-api-access-8tj5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.749951 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-config" (OuterVolumeSpecName: "config") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.749628 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.750394 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.752576 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.753425 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.754531 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e5171af-4752-4883-88ee-a9be2a6affbb" (UID: "2e5171af-4752-4883-88ee-a9be2a6affbb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807359 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807402 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807415 4618 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807429 4618 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807438 4618 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807448 4618 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e5171af-4752-4883-88ee-a9be2a6affbb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.807458 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tj5g\" (UniqueName: \"kubernetes.io/projected/2e5171af-4752-4883-88ee-a9be2a6affbb-kube-api-access-8tj5g\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.808093 
4618 scope.go:117] "RemoveContainer" containerID="539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543" Jan 21 09:22:02 crc kubenswrapper[4618]: E0121 09:22:02.808645 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543\": container with ID starting with 539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543 not found: ID does not exist" containerID="539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.808676 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543"} err="failed to get container status \"539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543\": rpc error: code = NotFound desc = could not find container \"539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543\": container with ID starting with 539d3709ea3e752716a5e1aa37eb67cd8033eecb9a424eeb2186772551260543 not found: ID does not exist" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.808714 4618 scope.go:117] "RemoveContainer" containerID="e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499" Jan 21 09:22:02 crc kubenswrapper[4618]: E0121 09:22:02.809320 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499\": container with ID starting with e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499 not found: ID does not exist" containerID="e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499" Jan 21 09:22:02 crc kubenswrapper[4618]: I0121 09:22:02.809364 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499"} err="failed to get container status \"e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499\": rpc error: code = NotFound desc = could not find container \"e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499\": container with ID starting with e9de719ad82cef0cac062a09ebe596cf28bc09899b8b1f5323cd0f38049ce499 not found: ID does not exist" Jan 21 09:22:03 crc kubenswrapper[4618]: I0121 09:22:03.016331 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8595b94875-vlz4p"] Jan 21 09:22:03 crc kubenswrapper[4618]: I0121 09:22:03.023601 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8595b94875-vlz4p"] Jan 21 09:22:03 crc kubenswrapper[4618]: I0121 09:22:03.548907 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" path="/var/lib/kubelet/pods/2e5171af-4752-4883-88ee-a9be2a6affbb/volumes" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.747784 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz"] Jan 21 09:22:10 crc kubenswrapper[4618]: E0121 09:22:10.748764 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerName="dnsmasq-dns" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.748781 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerName="dnsmasq-dns" Jan 21 09:22:10 crc kubenswrapper[4618]: E0121 09:22:10.748823 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerName="init" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.748829 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerName="init" Jan 21 
09:22:10 crc kubenswrapper[4618]: E0121 09:22:10.748845 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerName="dnsmasq-dns" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.748851 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerName="dnsmasq-dns" Jan 21 09:22:10 crc kubenswrapper[4618]: E0121 09:22:10.748864 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerName="init" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.748869 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerName="init" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.749061 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb7bafe-9ad7-4ecf-9bd8-90c11bddc43d" containerName="dnsmasq-dns" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.749092 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5171af-4752-4883-88ee-a9be2a6affbb" containerName="dnsmasq-dns" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.749766 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.752055 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.752972 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.753203 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.753326 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.776557 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz"] Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.889855 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.889920 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.889954 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.891255 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l55fj\" (UniqueName: \"kubernetes.io/projected/ac42dc63-60fa-42fe-8497-f7164e407083-kube-api-access-l55fj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.992555 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.992609 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.992645 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:10 crc kubenswrapper[4618]: I0121 09:22:10.992750 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l55fj\" (UniqueName: \"kubernetes.io/projected/ac42dc63-60fa-42fe-8497-f7164e407083-kube-api-access-l55fj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.000091 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.001361 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.001859 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.009645 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l55fj\" (UniqueName: \"kubernetes.io/projected/ac42dc63-60fa-42fe-8497-f7164e407083-kube-api-access-l55fj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.092961 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.586801 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz"] Jan 21 09:22:11 crc kubenswrapper[4618]: W0121 09:22:11.589109 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac42dc63_60fa_42fe_8497_f7164e407083.slice/crio-aeaf34f0a08dcb60722cacb9f9b71cf9de6ba13e87703daa37c85b30d4f29929 WatchSource:0}: Error finding container aeaf34f0a08dcb60722cacb9f9b71cf9de6ba13e87703daa37c85b30d4f29929: Status 404 returned error can't find the container with id aeaf34f0a08dcb60722cacb9f9b71cf9de6ba13e87703daa37c85b30d4f29929 Jan 21 09:22:11 crc kubenswrapper[4618]: I0121 09:22:11.789475 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" event={"ID":"ac42dc63-60fa-42fe-8497-f7164e407083","Type":"ContainerStarted","Data":"aeaf34f0a08dcb60722cacb9f9b71cf9de6ba13e87703daa37c85b30d4f29929"} Jan 21 09:22:14 crc kubenswrapper[4618]: I0121 09:22:14.826607 4618 generic.go:334] "Generic (PLEG): container finished" podID="1a257ccd-7e16-4450-810b-14a2dca56eab" 
containerID="4d1977ef800dab59cfab2878e3fabb0fede733a4c04a275619567a1c58429040" exitCode=0 Jan 21 09:22:14 crc kubenswrapper[4618]: I0121 09:22:14.826677 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a257ccd-7e16-4450-810b-14a2dca56eab","Type":"ContainerDied","Data":"4d1977ef800dab59cfab2878e3fabb0fede733a4c04a275619567a1c58429040"} Jan 21 09:22:15 crc kubenswrapper[4618]: I0121 09:22:15.840855 4618 generic.go:334] "Generic (PLEG): container finished" podID="6594f517-1fec-47c9-909d-674c8a7f36dd" containerID="fdb5dfe6b334afbb03bd9b3c0b2249040f90eafbfac5d8d55cce10a1728ec13a" exitCode=0 Jan 21 09:22:15 crc kubenswrapper[4618]: I0121 09:22:15.840929 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6594f517-1fec-47c9-909d-674c8a7f36dd","Type":"ContainerDied","Data":"fdb5dfe6b334afbb03bd9b3c0b2249040f90eafbfac5d8d55cce10a1728ec13a"} Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.881949 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" event={"ID":"ac42dc63-60fa-42fe-8497-f7164e407083","Type":"ContainerStarted","Data":"97a69c6a8028a9d0693b6e75637bb4b59466a88f7194dd36f409d66438006fcc"} Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.886359 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6594f517-1fec-47c9-909d-674c8a7f36dd","Type":"ContainerStarted","Data":"7985c0c7225ace6272342b2e5e1b00a2b16a8d5067daf3297d0b195e9bfd115a"} Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.886676 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.888845 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"1a257ccd-7e16-4450-810b-14a2dca56eab","Type":"ContainerStarted","Data":"0e20cccb672d73bb85ac311f5bef1db20a1d57b2c6c26ae445527f0affcc078e"} Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.889050 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.901776 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" podStartSLOduration=2.668054282 podStartE2EDuration="9.901760353s" podCreationTimestamp="2026-01-21 09:22:10 +0000 UTC" firstStartedPulling="2026-01-21 09:22:11.591711794 +0000 UTC m=+1130.342179111" lastFinishedPulling="2026-01-21 09:22:18.825417865 +0000 UTC m=+1137.575885182" observedRunningTime="2026-01-21 09:22:19.898091442 +0000 UTC m=+1138.648558759" watchObservedRunningTime="2026-01-21 09:22:19.901760353 +0000 UTC m=+1138.652227671" Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.921815 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.92179002 podStartE2EDuration="39.92179002s" podCreationTimestamp="2026-01-21 09:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:22:19.916538471 +0000 UTC m=+1138.667005788" watchObservedRunningTime="2026-01-21 09:22:19.92179002 +0000 UTC m=+1138.672257337" Jan 21 09:22:19 crc kubenswrapper[4618]: I0121 09:22:19.946756 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.94673428 podStartE2EDuration="38.94673428s" podCreationTimestamp="2026-01-21 09:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:22:19.938015109 +0000 UTC 
m=+1138.688482427" watchObservedRunningTime="2026-01-21 09:22:19.94673428 +0000 UTC m=+1138.697201598" Jan 21 09:22:29 crc kubenswrapper[4618]: I0121 09:22:29.987819 4618 generic.go:334] "Generic (PLEG): container finished" podID="ac42dc63-60fa-42fe-8497-f7164e407083" containerID="97a69c6a8028a9d0693b6e75637bb4b59466a88f7194dd36f409d66438006fcc" exitCode=0 Jan 21 09:22:29 crc kubenswrapper[4618]: I0121 09:22:29.987925 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" event={"ID":"ac42dc63-60fa-42fe-8497-f7164e407083","Type":"ContainerDied","Data":"97a69c6a8028a9d0693b6e75637bb4b59466a88f7194dd36f409d66438006fcc"} Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.107219 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.375570 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.516555 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-inventory\") pod \"ac42dc63-60fa-42fe-8497-f7164e407083\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.516805 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l55fj\" (UniqueName: \"kubernetes.io/projected/ac42dc63-60fa-42fe-8497-f7164e407083-kube-api-access-l55fj\") pod \"ac42dc63-60fa-42fe-8497-f7164e407083\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.516881 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-repo-setup-combined-ca-bundle\") pod \"ac42dc63-60fa-42fe-8497-f7164e407083\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.517014 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-ssh-key-openstack-edpm-ipam\") pod \"ac42dc63-60fa-42fe-8497-f7164e407083\" (UID: \"ac42dc63-60fa-42fe-8497-f7164e407083\") " Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.533255 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ac42dc63-60fa-42fe-8497-f7164e407083" (UID: "ac42dc63-60fa-42fe-8497-f7164e407083"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.533393 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac42dc63-60fa-42fe-8497-f7164e407083-kube-api-access-l55fj" (OuterVolumeSpecName: "kube-api-access-l55fj") pod "ac42dc63-60fa-42fe-8497-f7164e407083" (UID: "ac42dc63-60fa-42fe-8497-f7164e407083"). InnerVolumeSpecName "kube-api-access-l55fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.545159 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-inventory" (OuterVolumeSpecName: "inventory") pod "ac42dc63-60fa-42fe-8497-f7164e407083" (UID: "ac42dc63-60fa-42fe-8497-f7164e407083"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.553101 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac42dc63-60fa-42fe-8497-f7164e407083" (UID: "ac42dc63-60fa-42fe-8497-f7164e407083"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.620283 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.620317 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l55fj\" (UniqueName: \"kubernetes.io/projected/ac42dc63-60fa-42fe-8497-f7164e407083-kube-api-access-l55fj\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.620334 4618 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.620348 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac42dc63-60fa-42fe-8497-f7164e407083-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:31 crc kubenswrapper[4618]: I0121 09:22:31.804328 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.011911 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" event={"ID":"ac42dc63-60fa-42fe-8497-f7164e407083","Type":"ContainerDied","Data":"aeaf34f0a08dcb60722cacb9f9b71cf9de6ba13e87703daa37c85b30d4f29929"} Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.011954 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeaf34f0a08dcb60722cacb9f9b71cf9de6ba13e87703daa37c85b30d4f29929" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.011973 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.079589 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs"] Jan 21 09:22:32 crc kubenswrapper[4618]: E0121 09:22:32.080085 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac42dc63-60fa-42fe-8497-f7164e407083" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.080107 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac42dc63-60fa-42fe-8497-f7164e407083" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.080417 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac42dc63-60fa-42fe-8497-f7164e407083" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.081162 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.083209 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.083314 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.085050 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.085322 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.088906 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs"] Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.241119 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.241221 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.241256 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tz9\" (UniqueName: \"kubernetes.io/projected/1d012e11-c226-4c6f-b646-6358036a6924-kube-api-access-v5tz9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.343587 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.343743 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.344585 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tz9\" (UniqueName: \"kubernetes.io/projected/1d012e11-c226-4c6f-b646-6358036a6924-kube-api-access-v5tz9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.348460 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: 
\"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.348675 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.367512 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tz9\" (UniqueName: \"kubernetes.io/projected/1d012e11-c226-4c6f-b646-6358036a6924-kube-api-access-v5tz9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qg6fs\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.394608 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.871942 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs"] Jan 21 09:22:32 crc kubenswrapper[4618]: W0121 09:22:32.875554 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d012e11_c226_4c6f_b646_6358036a6924.slice/crio-e6e7583333524ffbf96cf059d13e255eb1dde259d129fb851a3368244fb3cd03 WatchSource:0}: Error finding container e6e7583333524ffbf96cf059d13e255eb1dde259d129fb851a3368244fb3cd03: Status 404 returned error can't find the container with id e6e7583333524ffbf96cf059d13e255eb1dde259d129fb851a3368244fb3cd03 Jan 21 09:22:32 crc kubenswrapper[4618]: I0121 09:22:32.878489 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:22:33 crc kubenswrapper[4618]: I0121 09:22:33.023359 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" event={"ID":"1d012e11-c226-4c6f-b646-6358036a6924","Type":"ContainerStarted","Data":"e6e7583333524ffbf96cf059d13e255eb1dde259d129fb851a3368244fb3cd03"} Jan 21 09:22:34 crc kubenswrapper[4618]: I0121 09:22:34.034875 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" event={"ID":"1d012e11-c226-4c6f-b646-6358036a6924","Type":"ContainerStarted","Data":"c1002f9f44f8fbe2ace191c2219f6e6849820f2b7144504074ba818addd6ca08"} Jan 21 09:22:34 crc kubenswrapper[4618]: I0121 09:22:34.057007 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" podStartSLOduration=1.575480109 podStartE2EDuration="2.056985152s" podCreationTimestamp="2026-01-21 09:22:32 +0000 UTC" 
firstStartedPulling="2026-01-21 09:22:32.87823959 +0000 UTC m=+1151.628706907" lastFinishedPulling="2026-01-21 09:22:33.359744634 +0000 UTC m=+1152.110211950" observedRunningTime="2026-01-21 09:22:34.051897291 +0000 UTC m=+1152.802364608" watchObservedRunningTime="2026-01-21 09:22:34.056985152 +0000 UTC m=+1152.807452469" Jan 21 09:22:36 crc kubenswrapper[4618]: I0121 09:22:36.055362 4618 generic.go:334] "Generic (PLEG): container finished" podID="1d012e11-c226-4c6f-b646-6358036a6924" containerID="c1002f9f44f8fbe2ace191c2219f6e6849820f2b7144504074ba818addd6ca08" exitCode=0 Jan 21 09:22:36 crc kubenswrapper[4618]: I0121 09:22:36.055435 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" event={"ID":"1d012e11-c226-4c6f-b646-6358036a6924","Type":"ContainerDied","Data":"c1002f9f44f8fbe2ace191c2219f6e6849820f2b7144504074ba818addd6ca08"} Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.370896 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.569711 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-inventory\") pod \"1d012e11-c226-4c6f-b646-6358036a6924\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.570073 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5tz9\" (UniqueName: \"kubernetes.io/projected/1d012e11-c226-4c6f-b646-6358036a6924-kube-api-access-v5tz9\") pod \"1d012e11-c226-4c6f-b646-6358036a6924\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.570603 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-ssh-key-openstack-edpm-ipam\") pod \"1d012e11-c226-4c6f-b646-6358036a6924\" (UID: \"1d012e11-c226-4c6f-b646-6358036a6924\") " Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.576176 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d012e11-c226-4c6f-b646-6358036a6924-kube-api-access-v5tz9" (OuterVolumeSpecName: "kube-api-access-v5tz9") pod "1d012e11-c226-4c6f-b646-6358036a6924" (UID: "1d012e11-c226-4c6f-b646-6358036a6924"). InnerVolumeSpecName "kube-api-access-v5tz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.594356 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1d012e11-c226-4c6f-b646-6358036a6924" (UID: "1d012e11-c226-4c6f-b646-6358036a6924"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.596715 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-inventory" (OuterVolumeSpecName: "inventory") pod "1d012e11-c226-4c6f-b646-6358036a6924" (UID: "1d012e11-c226-4c6f-b646-6358036a6924"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.674091 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.674130 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d012e11-c226-4c6f-b646-6358036a6924-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:37 crc kubenswrapper[4618]: I0121 09:22:37.674163 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5tz9\" (UniqueName: \"kubernetes.io/projected/1d012e11-c226-4c6f-b646-6358036a6924-kube-api-access-v5tz9\") on node \"crc\" DevicePath \"\"" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.077472 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" 
event={"ID":"1d012e11-c226-4c6f-b646-6358036a6924","Type":"ContainerDied","Data":"e6e7583333524ffbf96cf059d13e255eb1dde259d129fb851a3368244fb3cd03"} Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.077534 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qg6fs" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.077540 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e7583333524ffbf96cf059d13e255eb1dde259d129fb851a3368244fb3cd03" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.143694 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn"] Jan 21 09:22:38 crc kubenswrapper[4618]: E0121 09:22:38.144212 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d012e11-c226-4c6f-b646-6358036a6924" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.144232 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d012e11-c226-4c6f-b646-6358036a6924" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.144458 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d012e11-c226-4c6f-b646-6358036a6924" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.145286 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.147236 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.147486 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.148079 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.149379 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.163059 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn"] Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.186318 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.186525 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.186646 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.186677 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2dls\" (UniqueName: \"kubernetes.io/projected/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-kube-api-access-q2dls\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.288911 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.288982 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2dls\" (UniqueName: \"kubernetes.io/projected/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-kube-api-access-q2dls\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.289085 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.289412 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.295099 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.295312 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.295693 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.303672 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2dls\" (UniqueName: \"kubernetes.io/projected/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-kube-api-access-q2dls\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.464773 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:22:38 crc kubenswrapper[4618]: I0121 09:22:38.936128 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn"] Jan 21 09:22:38 crc kubenswrapper[4618]: W0121 09:22:38.937032 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod194d2dc3_6b5c_4cdf_aa65_3d6c088d1e90.slice/crio-3e7e8d9df27a3714e41a1a8521af9842911762db6115abc8b8c9cc91dce6226f WatchSource:0}: Error finding container 3e7e8d9df27a3714e41a1a8521af9842911762db6115abc8b8c9cc91dce6226f: Status 404 returned error can't find the container with id 3e7e8d9df27a3714e41a1a8521af9842911762db6115abc8b8c9cc91dce6226f Jan 21 09:22:39 crc kubenswrapper[4618]: I0121 09:22:39.091430 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" event={"ID":"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90","Type":"ContainerStarted","Data":"3e7e8d9df27a3714e41a1a8521af9842911762db6115abc8b8c9cc91dce6226f"} Jan 21 09:22:40 crc kubenswrapper[4618]: I0121 09:22:40.104367 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" 
event={"ID":"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90","Type":"ContainerStarted","Data":"fb7fae8028c50d6d03ee67e7bb8122a7f27d41aade082ecd7c639d6b7ca40360"} Jan 21 09:22:40 crc kubenswrapper[4618]: I0121 09:22:40.128196 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" podStartSLOduration=1.645101455 podStartE2EDuration="2.128169258s" podCreationTimestamp="2026-01-21 09:22:38 +0000 UTC" firstStartedPulling="2026-01-21 09:22:38.939111799 +0000 UTC m=+1157.689579116" lastFinishedPulling="2026-01-21 09:22:39.422179602 +0000 UTC m=+1158.172646919" observedRunningTime="2026-01-21 09:22:40.118805524 +0000 UTC m=+1158.869272841" watchObservedRunningTime="2026-01-21 09:22:40.128169258 +0000 UTC m=+1158.878636574" Jan 21 09:23:26 crc kubenswrapper[4618]: I0121 09:23:26.958895 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:23:26 crc kubenswrapper[4618]: I0121 09:23:26.959551 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:23:27 crc kubenswrapper[4618]: I0121 09:23:27.131789 4618 scope.go:117] "RemoveContainer" containerID="eb8a1155b53479e61967646e83499e2dd931bdd152bd2de1af221c9cafafdb08" Jan 21 09:23:56 crc kubenswrapper[4618]: I0121 09:23:56.959309 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:23:56 crc kubenswrapper[4618]: I0121 09:23:56.960008 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:24:26 crc kubenswrapper[4618]: I0121 09:24:26.959514 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:24:26 crc kubenswrapper[4618]: I0121 09:24:26.960132 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:24:26 crc kubenswrapper[4618]: I0121 09:24:26.960222 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:24:26 crc kubenswrapper[4618]: I0121 09:24:26.961207 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b82cb5509cc8c02fc5f5f4117ccf9ae4b6b90d3ab5a9f956d5d54bd8357ac4b"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:24:26 crc kubenswrapper[4618]: I0121 09:24:26.961272 4618 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://1b82cb5509cc8c02fc5f5f4117ccf9ae4b6b90d3ab5a9f956d5d54bd8357ac4b" gracePeriod=600 Jan 21 09:24:27 crc kubenswrapper[4618]: I0121 09:24:27.152031 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="1b82cb5509cc8c02fc5f5f4117ccf9ae4b6b90d3ab5a9f956d5d54bd8357ac4b" exitCode=0 Jan 21 09:24:27 crc kubenswrapper[4618]: I0121 09:24:27.152122 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"1b82cb5509cc8c02fc5f5f4117ccf9ae4b6b90d3ab5a9f956d5d54bd8357ac4b"} Jan 21 09:24:27 crc kubenswrapper[4618]: I0121 09:24:27.152316 4618 scope.go:117] "RemoveContainer" containerID="ba02e5d7a9b981ad1d2210de45ad9384cd8e1c52599c2747b664a4d50ae9a210" Jan 21 09:24:27 crc kubenswrapper[4618]: I0121 09:24:27.197463 4618 scope.go:117] "RemoveContainer" containerID="a1b897c6d1a799b4df6a433b824601689bfbad2a1b7ee1e32838121e14b0cb39" Jan 21 09:24:27 crc kubenswrapper[4618]: I0121 09:24:27.240219 4618 scope.go:117] "RemoveContainer" containerID="64e52c4c510a390b62d51e3ab84eba1a7d641d00c45adfa8f31062a12177b595" Jan 21 09:24:27 crc kubenswrapper[4618]: I0121 09:24:27.279673 4618 scope.go:117] "RemoveContainer" containerID="5cc4bf939c84d0a22200fdfb0482fd73589dcd02a395b7df41af44905590b5cb" Jan 21 09:24:28 crc kubenswrapper[4618]: I0121 09:24:28.161997 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"bd5991d4f6a04d2792fe93fca75aa63db48c790a76522c4fc7b2da7178ac6df6"} Jan 21 09:25:27 crc kubenswrapper[4618]: I0121 09:25:27.349925 4618 
scope.go:117] "RemoveContainer" containerID="aaf3a6579f555c617b640a91a299b26611326023d8ce349fe689f0dfe10c0299" Jan 21 09:25:27 crc kubenswrapper[4618]: I0121 09:25:27.375361 4618 scope.go:117] "RemoveContainer" containerID="9ad43541e15c378bd8b57317352791a435129bdd2c01ab5aa240a790edf94567" Jan 21 09:25:27 crc kubenswrapper[4618]: I0121 09:25:27.410263 4618 scope.go:117] "RemoveContainer" containerID="55c53d9ae857fa8bfcc058674e7b2eb54bb50d4f6d172488443ab4da2fbbb02d" Jan 21 09:25:27 crc kubenswrapper[4618]: I0121 09:25:27.440385 4618 scope.go:117] "RemoveContainer" containerID="b4030d3306f46a34332c6f7410af9b79ae2faab4495a8e8cc91750cb850a5769" Jan 21 09:25:36 crc kubenswrapper[4618]: I0121 09:25:36.805312 4618 generic.go:334] "Generic (PLEG): container finished" podID="194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" containerID="fb7fae8028c50d6d03ee67e7bb8122a7f27d41aade082ecd7c639d6b7ca40360" exitCode=0 Jan 21 09:25:36 crc kubenswrapper[4618]: I0121 09:25:36.805401 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" event={"ID":"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90","Type":"ContainerDied","Data":"fb7fae8028c50d6d03ee67e7bb8122a7f27d41aade082ecd7c639d6b7ca40360"} Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.149589 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.336089 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-inventory\") pod \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.336543 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-ssh-key-openstack-edpm-ipam\") pod \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.337308 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2dls\" (UniqueName: \"kubernetes.io/projected/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-kube-api-access-q2dls\") pod \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.337498 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-bootstrap-combined-ca-bundle\") pod \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\" (UID: \"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90\") " Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.343066 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-kube-api-access-q2dls" (OuterVolumeSpecName: "kube-api-access-q2dls") pod "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" (UID: "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90"). InnerVolumeSpecName "kube-api-access-q2dls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.343173 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" (UID: "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.364037 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" (UID: "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.379895 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-inventory" (OuterVolumeSpecName: "inventory") pod "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" (UID: "194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.440445 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2dls\" (UniqueName: \"kubernetes.io/projected/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-kube-api-access-q2dls\") on node \"crc\" DevicePath \"\"" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.440475 4618 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.440489 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.440503 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.831192 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" event={"ID":"194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90","Type":"ContainerDied","Data":"3e7e8d9df27a3714e41a1a8521af9842911762db6115abc8b8c9cc91dce6226f"} Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.831260 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e7e8d9df27a3714e41a1a8521af9842911762db6115abc8b8c9cc91dce6226f" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.831257 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.900239 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2"] Jan 21 09:25:38 crc kubenswrapper[4618]: E0121 09:25:38.900809 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.900830 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.901080 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.901867 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.904184 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.904248 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.904194 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.904606 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.908421 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2"] Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.953065 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:38 crc kubenswrapper[4618]: I0121 09:25:38.953312 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4l9h\" (UniqueName: \"kubernetes.io/projected/9fab9896-c90d-47af-9a73-4cf53b19d631-kube-api-access-c4l9h\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:38 crc 
kubenswrapper[4618]: I0121 09:25:38.953506 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.055970 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4l9h\" (UniqueName: \"kubernetes.io/projected/9fab9896-c90d-47af-9a73-4cf53b19d631-kube-api-access-c4l9h\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.056177 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.056302 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.063762 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.063786 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.070807 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4l9h\" (UniqueName: \"kubernetes.io/projected/9fab9896-c90d-47af-9a73-4cf53b19d631-kube-api-access-c4l9h\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.216762 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.672483 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2"] Jan 21 09:25:39 crc kubenswrapper[4618]: I0121 09:25:39.842197 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" event={"ID":"9fab9896-c90d-47af-9a73-4cf53b19d631","Type":"ContainerStarted","Data":"94d4ab4e4b6180be484c89b3bd531f50d0dfa082ce9517e834dee884bec30907"} Jan 21 09:25:40 crc kubenswrapper[4618]: I0121 09:25:40.849177 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" event={"ID":"9fab9896-c90d-47af-9a73-4cf53b19d631","Type":"ContainerStarted","Data":"b1df518094348309a98e961e22d0edbdc916a8a52109e4f9748c6437cda6678c"} Jan 21 09:25:40 crc kubenswrapper[4618]: I0121 09:25:40.869131 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" podStartSLOduration=2.341025082 podStartE2EDuration="2.869115827s" podCreationTimestamp="2026-01-21 09:25:38 +0000 UTC" firstStartedPulling="2026-01-21 09:25:39.676955797 +0000 UTC m=+1338.427423114" lastFinishedPulling="2026-01-21 09:25:40.205046542 +0000 UTC m=+1338.955513859" observedRunningTime="2026-01-21 09:25:40.864345003 +0000 UTC m=+1339.614812321" watchObservedRunningTime="2026-01-21 09:25:40.869115827 +0000 UTC m=+1339.619583143" Jan 21 09:26:27 crc kubenswrapper[4618]: I0121 09:26:27.514377 4618 scope.go:117] "RemoveContainer" containerID="3e2d834ff9dc0893d333058d6573f7d9b5b3eddb14271e82d8d8731e775d7fea" Jan 21 09:26:56 crc kubenswrapper[4618]: I0121 09:26:56.958683 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:26:56 crc kubenswrapper[4618]: I0121 09:26:56.959288 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:27:13 crc kubenswrapper[4618]: I0121 09:27:13.590968 4618 generic.go:334] "Generic (PLEG): container finished" podID="9fab9896-c90d-47af-9a73-4cf53b19d631" containerID="b1df518094348309a98e961e22d0edbdc916a8a52109e4f9748c6437cda6678c" exitCode=0 Jan 21 09:27:13 crc kubenswrapper[4618]: I0121 09:27:13.591063 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" event={"ID":"9fab9896-c90d-47af-9a73-4cf53b19d631","Type":"ContainerDied","Data":"b1df518094348309a98e961e22d0edbdc916a8a52109e4f9748c6437cda6678c"} Jan 21 09:27:14 crc kubenswrapper[4618]: I0121 09:27:14.888229 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.033610 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4385-account-create-update-4xshw"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.039968 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2nfqk"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.046330 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-w5c5w"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.051399 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-48a1-account-create-update-mgr8l"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.056576 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4385-account-create-update-4xshw"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.061931 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2nfqk"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.067006 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-48a1-account-create-update-mgr8l"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.072150 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-w5c5w"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.083516 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4l9h\" (UniqueName: \"kubernetes.io/projected/9fab9896-c90d-47af-9a73-4cf53b19d631-kube-api-access-c4l9h\") pod \"9fab9896-c90d-47af-9a73-4cf53b19d631\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.083740 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-ssh-key-openstack-edpm-ipam\") pod \"9fab9896-c90d-47af-9a73-4cf53b19d631\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.083783 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-inventory\") pod \"9fab9896-c90d-47af-9a73-4cf53b19d631\" (UID: \"9fab9896-c90d-47af-9a73-4cf53b19d631\") " Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.087677 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fab9896-c90d-47af-9a73-4cf53b19d631-kube-api-access-c4l9h" (OuterVolumeSpecName: "kube-api-access-c4l9h") pod "9fab9896-c90d-47af-9a73-4cf53b19d631" (UID: "9fab9896-c90d-47af-9a73-4cf53b19d631"). InnerVolumeSpecName "kube-api-access-c4l9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.103499 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9fab9896-c90d-47af-9a73-4cf53b19d631" (UID: "9fab9896-c90d-47af-9a73-4cf53b19d631"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.103890 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-inventory" (OuterVolumeSpecName: "inventory") pod "9fab9896-c90d-47af-9a73-4cf53b19d631" (UID: "9fab9896-c90d-47af-9a73-4cf53b19d631"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.186271 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.186939 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fab9896-c90d-47af-9a73-4cf53b19d631-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.187003 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4l9h\" (UniqueName: \"kubernetes.io/projected/9fab9896-c90d-47af-9a73-4cf53b19d631-kube-api-access-c4l9h\") on node \"crc\" DevicePath \"\"" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.546915 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2837ccc7-f854-4658-9c44-bc288e4dad4a" path="/var/lib/kubelet/pods/2837ccc7-f854-4658-9c44-bc288e4dad4a/volumes" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.547483 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55c26a4-172d-4d4b-abb9-5acde43e75df" path="/var/lib/kubelet/pods/b55c26a4-172d-4d4b-abb9-5acde43e75df/volumes" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.547976 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be40f1a4-5983-4d88-8cb3-7a923c1f7d45" path="/var/lib/kubelet/pods/be40f1a4-5983-4d88-8cb3-7a923c1f7d45/volumes" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.548485 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8693102-51ce-4b10-8263-8f4e32c29a42" path="/var/lib/kubelet/pods/f8693102-51ce-4b10-8263-8f4e32c29a42/volumes" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.605360 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" event={"ID":"9fab9896-c90d-47af-9a73-4cf53b19d631","Type":"ContainerDied","Data":"94d4ab4e4b6180be484c89b3bd531f50d0dfa082ce9517e834dee884bec30907"} Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.605399 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.605400 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94d4ab4e4b6180be484c89b3bd531f50d0dfa082ce9517e834dee884bec30907" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.699829 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj"] Jan 21 09:27:15 crc kubenswrapper[4618]: E0121 09:27:15.700250 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fab9896-c90d-47af-9a73-4cf53b19d631" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.700268 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fab9896-c90d-47af-9a73-4cf53b19d631" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.700463 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fab9896-c90d-47af-9a73-4cf53b19d631" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.701019 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.703248 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.703678 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.703879 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.711935 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj"] Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.732302 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.898040 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk8lc\" (UniqueName: \"kubernetes.io/projected/1b8522ab-9a18-468c-a001-27aa7228e059-kube-api-access-dk8lc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:15 crc kubenswrapper[4618]: I0121 09:27:15.898709 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:15 crc kubenswrapper[4618]: 
I0121 09:27:15.898828 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.000762 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.000958 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk8lc\" (UniqueName: \"kubernetes.io/projected/1b8522ab-9a18-468c-a001-27aa7228e059-kube-api-access-dk8lc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.000991 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.004625 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.006328 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.015856 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk8lc\" (UniqueName: \"kubernetes.io/projected/1b8522ab-9a18-468c-a001-27aa7228e059-kube-api-access-dk8lc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44vvj\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.018132 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.451410 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj"] Jan 21 09:27:16 crc kubenswrapper[4618]: I0121 09:27:16.613651 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" event={"ID":"1b8522ab-9a18-468c-a001-27aa7228e059","Type":"ContainerStarted","Data":"f0b165f45b95415e65cff894ec87e5c21cbff83ecb71d90b3b36f3ff11c656c2"} Jan 21 09:27:17 crc kubenswrapper[4618]: I0121 09:27:17.621531 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" event={"ID":"1b8522ab-9a18-468c-a001-27aa7228e059","Type":"ContainerStarted","Data":"7fd049495e925917e4fbe7765a16dd3a65354ff6a92401e7cd1387f42e8bb94c"} Jan 21 09:27:17 crc kubenswrapper[4618]: I0121 09:27:17.639505 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" podStartSLOduration=2.157154345 podStartE2EDuration="2.639492778s" podCreationTimestamp="2026-01-21 09:27:15 +0000 UTC" firstStartedPulling="2026-01-21 09:27:16.455319511 +0000 UTC m=+1435.205786827" lastFinishedPulling="2026-01-21 09:27:16.937657944 +0000 UTC m=+1435.688125260" observedRunningTime="2026-01-21 09:27:17.632287051 +0000 UTC m=+1436.382754369" watchObservedRunningTime="2026-01-21 09:27:17.639492778 +0000 UTC m=+1436.389960096" Jan 21 09:27:21 crc kubenswrapper[4618]: I0121 09:27:21.027475 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6384-account-create-update-8kht2"] Jan 21 09:27:21 crc kubenswrapper[4618]: I0121 09:27:21.034080 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6384-account-create-update-8kht2"] Jan 
21 09:27:21 crc kubenswrapper[4618]: I0121 09:27:21.040219 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fbbjn"] Jan 21 09:27:21 crc kubenswrapper[4618]: I0121 09:27:21.045253 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fbbjn"] Jan 21 09:27:21 crc kubenswrapper[4618]: I0121 09:27:21.558771 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2858edc9-0823-41b9-9c3a-ca8eecb450fa" path="/var/lib/kubelet/pods/2858edc9-0823-41b9-9c3a-ca8eecb450fa/volumes" Jan 21 09:27:21 crc kubenswrapper[4618]: I0121 09:27:21.559613 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76cf94b1-8904-4389-8ef3-8dd36ea02ecf" path="/var/lib/kubelet/pods/76cf94b1-8904-4389-8ef3-8dd36ea02ecf/volumes" Jan 21 09:27:26 crc kubenswrapper[4618]: I0121 09:27:26.958814 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:27:26 crc kubenswrapper[4618]: I0121 09:27:26.959437 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:27:27 crc kubenswrapper[4618]: I0121 09:27:27.563026 4618 scope.go:117] "RemoveContainer" containerID="3e270b57684dde95f75300c11054eb3dbc3ada609e7b71563c8bc1ff3cbc7551" Jan 21 09:27:27 crc kubenswrapper[4618]: I0121 09:27:27.591222 4618 scope.go:117] "RemoveContainer" containerID="447b5e82b2477dfc53d662a416596a09b3a32b25f5eef55f5f764aeb64a9b5ca" Jan 21 09:27:27 crc kubenswrapper[4618]: I0121 09:27:27.621884 4618 scope.go:117] 
"RemoveContainer" containerID="9063f83d107d6f66f2d6174199ebe5f0ac4c28046a82e5d5e456a1086dbf455b" Jan 21 09:27:27 crc kubenswrapper[4618]: I0121 09:27:27.650856 4618 scope.go:117] "RemoveContainer" containerID="dbdb817b025e2510ae1d191a4f58054e9e8287c91183dd592861121055eaf439" Jan 21 09:27:27 crc kubenswrapper[4618]: I0121 09:27:27.685444 4618 scope.go:117] "RemoveContainer" containerID="af9e0729408e82eec4aca27f3402390aaed3ef37c6d7a2df988bc8348db2c012" Jan 21 09:27:27 crc kubenswrapper[4618]: I0121 09:27:27.717111 4618 scope.go:117] "RemoveContainer" containerID="5bcf469c17fd3ad450902872d022584f8ffdc336d641778d53260b97a1733d4e" Jan 21 09:27:36 crc kubenswrapper[4618]: I0121 09:27:36.022528 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-n4b6j"] Jan 21 09:27:36 crc kubenswrapper[4618]: I0121 09:27:36.049052 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-n4b6j"] Jan 21 09:27:37 crc kubenswrapper[4618]: I0121 09:27:37.547501 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a90b60-c5b7-409e-b6cd-53e7c1a0006e" path="/var/lib/kubelet/pods/38a90b60-c5b7-409e-b6cd-53e7c1a0006e/volumes" Jan 21 09:27:39 crc kubenswrapper[4618]: I0121 09:27:39.027602 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wmsxn"] Jan 21 09:27:39 crc kubenswrapper[4618]: I0121 09:27:39.035322 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wmsxn"] Jan 21 09:27:39 crc kubenswrapper[4618]: I0121 09:27:39.546433 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb18034-16fd-4cfc-8748-2c58b8584346" path="/var/lib/kubelet/pods/0fb18034-16fd-4cfc-8748-2c58b8584346/volumes" Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.039264 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mglts"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 
09:27:53.045791 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-efa7-account-create-update-btb2b"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.052339 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-dfce-account-create-update-l66gf"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.059021 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-p4qcp"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.064378 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mglts"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.069376 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-dfce-account-create-update-l66gf"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.074361 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-efa7-account-create-update-btb2b"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.079101 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-p4qcp"] Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.549865 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e120b8a-c3f7-41ec-8987-a8b74198bc74" path="/var/lib/kubelet/pods/1e120b8a-c3f7-41ec-8987-a8b74198bc74/volumes" Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.550495 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2835ff70-c16c-42dd-8971-f3cfd0ae800f" path="/var/lib/kubelet/pods/2835ff70-c16c-42dd-8971-f3cfd0ae800f/volumes" Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.551006 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ff954b-927c-4066-b75d-0e0f4530f888" path="/var/lib/kubelet/pods/a0ff954b-927c-4066-b75d-0e0f4530f888/volumes" Jan 21 09:27:53 crc kubenswrapper[4618]: I0121 09:27:53.551521 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e3cd0644-db42-42d1-884b-8c341d4eb1c9" path="/var/lib/kubelet/pods/e3cd0644-db42-42d1-884b-8c341d4eb1c9/volumes" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.323674 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxm67"] Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.327009 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.333900 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxm67"] Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.397109 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-utilities\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.397198 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97lf\" (UniqueName: \"kubernetes.io/projected/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-kube-api-access-d97lf\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.397250 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-catalog-content\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.500091 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97lf\" (UniqueName: \"kubernetes.io/projected/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-kube-api-access-d97lf\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.500231 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-catalog-content\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.500341 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-utilities\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.500905 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-catalog-content\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.501071 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-utilities\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.520411 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d97lf\" (UniqueName: \"kubernetes.io/projected/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-kube-api-access-d97lf\") pod \"redhat-marketplace-cxm67\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.643457 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.958824 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.959483 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.959540 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.960713 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd5991d4f6a04d2792fe93fca75aa63db48c790a76522c4fc7b2da7178ac6df6"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:27:56 crc kubenswrapper[4618]: I0121 09:27:56.960783 4618 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://bd5991d4f6a04d2792fe93fca75aa63db48c790a76522c4fc7b2da7178ac6df6" gracePeriod=600 Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.031839 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-w578p"] Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.038542 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-00df-account-create-update-d7dj6"] Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.045249 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-00df-account-create-update-d7dj6"] Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.050029 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-w578p"] Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.062918 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxm67"] Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.551296 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9359c136-cac8-46b2-b341-b893187d9476" path="/var/lib/kubelet/pods/9359c136-cac8-46b2-b341-b893187d9476/volumes" Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.552592 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37791f6-6d08-4964-9ac8-c9d99f04979c" path="/var/lib/kubelet/pods/a37791f6-6d08-4964-9ac8-c9d99f04979c/volumes" Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.929639 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="bd5991d4f6a04d2792fe93fca75aa63db48c790a76522c4fc7b2da7178ac6df6" exitCode=0 Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.929721 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"bd5991d4f6a04d2792fe93fca75aa63db48c790a76522c4fc7b2da7178ac6df6"} Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.929798 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c"} Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.929832 4618 scope.go:117] "RemoveContainer" containerID="1b82cb5509cc8c02fc5f5f4117ccf9ae4b6b90d3ab5a9f956d5d54bd8357ac4b" Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.933255 4618 generic.go:334] "Generic (PLEG): container finished" podID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerID="e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c" exitCode=0 Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.933428 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxm67" event={"ID":"994be938-45c7-4b38-8ec0-9cb5adb3d2f5","Type":"ContainerDied","Data":"e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c"} Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.933600 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxm67" event={"ID":"994be938-45c7-4b38-8ec0-9cb5adb3d2f5","Type":"ContainerStarted","Data":"3bf5599da37ec11e53917ce4012458f1d0c1bcdac471ab17e064ec69facb2e6e"} Jan 21 09:27:57 crc kubenswrapper[4618]: I0121 09:27:57.935870 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:27:58 crc kubenswrapper[4618]: I0121 09:27:58.943277 4618 generic.go:334] "Generic (PLEG): container finished" podID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" 
containerID="06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb" exitCode=0 Jan 21 09:27:58 crc kubenswrapper[4618]: I0121 09:27:58.943373 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxm67" event={"ID":"994be938-45c7-4b38-8ec0-9cb5adb3d2f5","Type":"ContainerDied","Data":"06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb"} Jan 21 09:27:59 crc kubenswrapper[4618]: I0121 09:27:59.960493 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxm67" event={"ID":"994be938-45c7-4b38-8ec0-9cb5adb3d2f5","Type":"ContainerStarted","Data":"8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044"} Jan 21 09:27:59 crc kubenswrapper[4618]: I0121 09:27:59.977122 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxm67" podStartSLOduration=2.3983863469999998 podStartE2EDuration="3.977098397s" podCreationTimestamp="2026-01-21 09:27:56 +0000 UTC" firstStartedPulling="2026-01-21 09:27:57.935535852 +0000 UTC m=+1476.686003169" lastFinishedPulling="2026-01-21 09:27:59.514247901 +0000 UTC m=+1478.264715219" observedRunningTime="2026-01-21 09:27:59.973597916 +0000 UTC m=+1478.724065234" watchObservedRunningTime="2026-01-21 09:27:59.977098397 +0000 UTC m=+1478.727565715" Jan 21 09:28:00 crc kubenswrapper[4618]: I0121 09:28:00.025749 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mdmw7"] Jan 21 09:28:00 crc kubenswrapper[4618]: I0121 09:28:00.031350 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mdmw7"] Jan 21 09:28:01 crc kubenswrapper[4618]: I0121 09:28:01.563548 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e651bca2-bb3f-4946-a656-8900b6c25427" path="/var/lib/kubelet/pods/e651bca2-bb3f-4946-a656-8900b6c25427/volumes" Jan 21 09:28:06 crc kubenswrapper[4618]: I0121 
09:28:06.644527 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:28:06 crc kubenswrapper[4618]: I0121 09:28:06.645079 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:28:06 crc kubenswrapper[4618]: I0121 09:28:06.686562 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:28:07 crc kubenswrapper[4618]: I0121 09:28:07.045793 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:28:07 crc kubenswrapper[4618]: I0121 09:28:07.089682 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxm67"] Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.026598 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cxm67" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="registry-server" containerID="cri-o://8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044" gracePeriod=2 Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.386302 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.552212 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d97lf\" (UniqueName: \"kubernetes.io/projected/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-kube-api-access-d97lf\") pod \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.552304 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-catalog-content\") pod \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.552334 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-utilities\") pod \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\" (UID: \"994be938-45c7-4b38-8ec0-9cb5adb3d2f5\") " Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.553862 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-utilities" (OuterVolumeSpecName: "utilities") pod "994be938-45c7-4b38-8ec0-9cb5adb3d2f5" (UID: "994be938-45c7-4b38-8ec0-9cb5adb3d2f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.557502 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-kube-api-access-d97lf" (OuterVolumeSpecName: "kube-api-access-d97lf") pod "994be938-45c7-4b38-8ec0-9cb5adb3d2f5" (UID: "994be938-45c7-4b38-8ec0-9cb5adb3d2f5"). InnerVolumeSpecName "kube-api-access-d97lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.569446 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "994be938-45c7-4b38-8ec0-9cb5adb3d2f5" (UID: "994be938-45c7-4b38-8ec0-9cb5adb3d2f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.654706 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d97lf\" (UniqueName: \"kubernetes.io/projected/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-kube-api-access-d97lf\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.654743 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:09 crc kubenswrapper[4618]: I0121 09:28:09.654754 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/994be938-45c7-4b38-8ec0-9cb5adb3d2f5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.035127 4618 generic.go:334] "Generic (PLEG): container finished" podID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerID="8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044" exitCode=0 Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.035204 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxm67" event={"ID":"994be938-45c7-4b38-8ec0-9cb5adb3d2f5","Type":"ContainerDied","Data":"8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044"} Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.035236 4618 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-cxm67" event={"ID":"994be938-45c7-4b38-8ec0-9cb5adb3d2f5","Type":"ContainerDied","Data":"3bf5599da37ec11e53917ce4012458f1d0c1bcdac471ab17e064ec69facb2e6e"} Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.035254 4618 scope.go:117] "RemoveContainer" containerID="8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.035405 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxm67" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.051507 4618 scope.go:117] "RemoveContainer" containerID="06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.071002 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxm67"] Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.077681 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxm67"] Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.087887 4618 scope.go:117] "RemoveContainer" containerID="e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.112100 4618 scope.go:117] "RemoveContainer" containerID="8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044" Jan 21 09:28:10 crc kubenswrapper[4618]: E0121 09:28:10.112525 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044\": container with ID starting with 8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044 not found: ID does not exist" containerID="8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.112563 4618 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044"} err="failed to get container status \"8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044\": rpc error: code = NotFound desc = could not find container \"8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044\": container with ID starting with 8bbfde40af22e1884123cd29a5f9a1905e4f6ef22c028d9c527c9aec63285044 not found: ID does not exist" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.112588 4618 scope.go:117] "RemoveContainer" containerID="06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb" Jan 21 09:28:10 crc kubenswrapper[4618]: E0121 09:28:10.112894 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb\": container with ID starting with 06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb not found: ID does not exist" containerID="06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.112928 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb"} err="failed to get container status \"06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb\": rpc error: code = NotFound desc = could not find container \"06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb\": container with ID starting with 06690fbdd6e3414ef25c1d15e706b7c4c88a59a661847dc3896cd756910c1edb not found: ID does not exist" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.112944 4618 scope.go:117] "RemoveContainer" containerID="e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c" Jan 21 09:28:10 crc kubenswrapper[4618]: E0121 
09:28:10.113186 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c\": container with ID starting with e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c not found: ID does not exist" containerID="e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c" Jan 21 09:28:10 crc kubenswrapper[4618]: I0121 09:28:10.113217 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c"} err="failed to get container status \"e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c\": rpc error: code = NotFound desc = could not find container \"e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c\": container with ID starting with e3240d237f75a42bcd9afba7c7404441eb559c9cdff1f84099d57dda1e48194c not found: ID does not exist" Jan 21 09:28:11 crc kubenswrapper[4618]: I0121 09:28:11.548517 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" path="/var/lib/kubelet/pods/994be938-45c7-4b38-8ec0-9cb5adb3d2f5/volumes" Jan 21 09:28:16 crc kubenswrapper[4618]: I0121 09:28:16.088232 4618 generic.go:334] "Generic (PLEG): container finished" podID="1b8522ab-9a18-468c-a001-27aa7228e059" containerID="7fd049495e925917e4fbe7765a16dd3a65354ff6a92401e7cd1387f42e8bb94c" exitCode=0 Jan 21 09:28:16 crc kubenswrapper[4618]: I0121 09:28:16.088313 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" event={"ID":"1b8522ab-9a18-468c-a001-27aa7228e059","Type":"ContainerDied","Data":"7fd049495e925917e4fbe7765a16dd3a65354ff6a92401e7cd1387f42e8bb94c"} Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.431524 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.599532 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-ssh-key-openstack-edpm-ipam\") pod \"1b8522ab-9a18-468c-a001-27aa7228e059\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.599779 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-inventory\") pod \"1b8522ab-9a18-468c-a001-27aa7228e059\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.599839 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk8lc\" (UniqueName: \"kubernetes.io/projected/1b8522ab-9a18-468c-a001-27aa7228e059-kube-api-access-dk8lc\") pod \"1b8522ab-9a18-468c-a001-27aa7228e059\" (UID: \"1b8522ab-9a18-468c-a001-27aa7228e059\") " Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.617608 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8522ab-9a18-468c-a001-27aa7228e059-kube-api-access-dk8lc" (OuterVolumeSpecName: "kube-api-access-dk8lc") pod "1b8522ab-9a18-468c-a001-27aa7228e059" (UID: "1b8522ab-9a18-468c-a001-27aa7228e059"). InnerVolumeSpecName "kube-api-access-dk8lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.623471 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-inventory" (OuterVolumeSpecName: "inventory") pod "1b8522ab-9a18-468c-a001-27aa7228e059" (UID: "1b8522ab-9a18-468c-a001-27aa7228e059"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.623788 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1b8522ab-9a18-468c-a001-27aa7228e059" (UID: "1b8522ab-9a18-468c-a001-27aa7228e059"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.702121 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.702259 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk8lc\" (UniqueName: \"kubernetes.io/projected/1b8522ab-9a18-468c-a001-27aa7228e059-kube-api-access-dk8lc\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:17 crc kubenswrapper[4618]: I0121 09:28:17.702331 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b8522ab-9a18-468c-a001-27aa7228e059-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.032130 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jqn8x"] Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.038101 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jqn8x"] Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.105779 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" 
event={"ID":"1b8522ab-9a18-468c-a001-27aa7228e059","Type":"ContainerDied","Data":"f0b165f45b95415e65cff894ec87e5c21cbff83ecb71d90b3b36f3ff11c656c2"} Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.105819 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b165f45b95415e65cff894ec87e5c21cbff83ecb71d90b3b36f3ff11c656c2" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.105873 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44vvj" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.175581 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh"] Jan 21 09:28:18 crc kubenswrapper[4618]: E0121 09:28:18.176174 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="registry-server" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.176196 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="registry-server" Jan 21 09:28:18 crc kubenswrapper[4618]: E0121 09:28:18.176214 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8522ab-9a18-468c-a001-27aa7228e059" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.176222 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8522ab-9a18-468c-a001-27aa7228e059" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:18 crc kubenswrapper[4618]: E0121 09:28:18.176245 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="extract-content" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.176251 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="extract-content" Jan 21 09:28:18 crc kubenswrapper[4618]: E0121 09:28:18.176272 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="extract-utilities" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.176278 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="extract-utilities" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.176478 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8522ab-9a18-468c-a001-27aa7228e059" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.176501 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="994be938-45c7-4b38-8ec0-9cb5adb3d2f5" containerName="registry-server" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.177340 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.180075 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.180180 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.180326 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.181118 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.183569 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh"] Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.318908 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46s4\" (UniqueName: \"kubernetes.io/projected/05574b5d-bd37-4837-a247-9f1f5bb09d09-kube-api-access-h46s4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.318999 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 
09:28:18.319087 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.419778 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h46s4\" (UniqueName: \"kubernetes.io/projected/05574b5d-bd37-4837-a247-9f1f5bb09d09-kube-api-access-h46s4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.419831 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.419870 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.423918 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.423943 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.433638 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h46s4\" (UniqueName: \"kubernetes.io/projected/05574b5d-bd37-4837-a247-9f1f5bb09d09-kube-api-access-h46s4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.493115 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:18 crc kubenswrapper[4618]: I0121 09:28:18.942427 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh"] Jan 21 09:28:19 crc kubenswrapper[4618]: I0121 09:28:19.116290 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" event={"ID":"05574b5d-bd37-4837-a247-9f1f5bb09d09","Type":"ContainerStarted","Data":"1e8609d47057b54e6aa71bcad63e53ba70fcab16f0d55f80b5caae1f881e840d"} Jan 21 09:28:19 crc kubenswrapper[4618]: I0121 09:28:19.548451 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0f8544-4e4e-49e4-8eff-43529c9e607b" path="/var/lib/kubelet/pods/1f0f8544-4e4e-49e4-8eff-43529c9e607b/volumes" Jan 21 09:28:20 crc kubenswrapper[4618]: I0121 09:28:20.125509 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" event={"ID":"05574b5d-bd37-4837-a247-9f1f5bb09d09","Type":"ContainerStarted","Data":"5c4df2581209e67f775e424a5537ecedac4b0eb349a6dd89cfbdb5ee5795989e"} Jan 21 09:28:20 crc kubenswrapper[4618]: I0121 09:28:20.141705 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" podStartSLOduration=1.663190067 podStartE2EDuration="2.1416872s" podCreationTimestamp="2026-01-21 09:28:18 +0000 UTC" firstStartedPulling="2026-01-21 09:28:18.945325317 +0000 UTC m=+1497.695792634" lastFinishedPulling="2026-01-21 09:28:19.423822449 +0000 UTC m=+1498.174289767" observedRunningTime="2026-01-21 09:28:20.137051345 +0000 UTC m=+1498.887518662" watchObservedRunningTime="2026-01-21 09:28:20.1416872 +0000 UTC m=+1498.892154517" Jan 21 09:28:23 crc kubenswrapper[4618]: I0121 09:28:23.154164 4618 generic.go:334] "Generic (PLEG): container 
finished" podID="05574b5d-bd37-4837-a247-9f1f5bb09d09" containerID="5c4df2581209e67f775e424a5537ecedac4b0eb349a6dd89cfbdb5ee5795989e" exitCode=0 Jan 21 09:28:23 crc kubenswrapper[4618]: I0121 09:28:23.154245 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" event={"ID":"05574b5d-bd37-4837-a247-9f1f5bb09d09","Type":"ContainerDied","Data":"5c4df2581209e67f775e424a5537ecedac4b0eb349a6dd89cfbdb5ee5795989e"} Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.514324 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.651479 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-ssh-key-openstack-edpm-ipam\") pod \"05574b5d-bd37-4837-a247-9f1f5bb09d09\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.651573 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h46s4\" (UniqueName: \"kubernetes.io/projected/05574b5d-bd37-4837-a247-9f1f5bb09d09-kube-api-access-h46s4\") pod \"05574b5d-bd37-4837-a247-9f1f5bb09d09\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.651671 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-inventory\") pod \"05574b5d-bd37-4837-a247-9f1f5bb09d09\" (UID: \"05574b5d-bd37-4837-a247-9f1f5bb09d09\") " Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.657843 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/05574b5d-bd37-4837-a247-9f1f5bb09d09-kube-api-access-h46s4" (OuterVolumeSpecName: "kube-api-access-h46s4") pod "05574b5d-bd37-4837-a247-9f1f5bb09d09" (UID: "05574b5d-bd37-4837-a247-9f1f5bb09d09"). InnerVolumeSpecName "kube-api-access-h46s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.682261 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-inventory" (OuterVolumeSpecName: "inventory") pod "05574b5d-bd37-4837-a247-9f1f5bb09d09" (UID: "05574b5d-bd37-4837-a247-9f1f5bb09d09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.682416 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05574b5d-bd37-4837-a247-9f1f5bb09d09" (UID: "05574b5d-bd37-4837-a247-9f1f5bb09d09"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.754688 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.754713 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05574b5d-bd37-4837-a247-9f1f5bb09d09-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:24 crc kubenswrapper[4618]: I0121 09:28:24.754725 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h46s4\" (UniqueName: \"kubernetes.io/projected/05574b5d-bd37-4837-a247-9f1f5bb09d09-kube-api-access-h46s4\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.177649 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" event={"ID":"05574b5d-bd37-4837-a247-9f1f5bb09d09","Type":"ContainerDied","Data":"1e8609d47057b54e6aa71bcad63e53ba70fcab16f0d55f80b5caae1f881e840d"} Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.177702 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8609d47057b54e6aa71bcad63e53ba70fcab16f0d55f80b5caae1f881e840d" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.177758 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.240437 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2"] Jan 21 09:28:25 crc kubenswrapper[4618]: E0121 09:28:25.241309 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05574b5d-bd37-4837-a247-9f1f5bb09d09" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.241335 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="05574b5d-bd37-4837-a247-9f1f5bb09d09" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.241847 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="05574b5d-bd37-4837-a247-9f1f5bb09d09" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.243054 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.247272 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.247415 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.247547 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.247680 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.263832 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2"] Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.367354 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.367415 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.367449 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8n4j\" (UniqueName: \"kubernetes.io/projected/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-kube-api-access-f8n4j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.470364 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.470413 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.471010 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8n4j\" (UniqueName: \"kubernetes.io/projected/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-kube-api-access-f8n4j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.474871 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.476127 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.484719 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8n4j\" (UniqueName: \"kubernetes.io/projected/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-kube-api-access-f8n4j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4b7z2\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:25 crc kubenswrapper[4618]: I0121 09:28:25.569006 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:26 crc kubenswrapper[4618]: I0121 09:28:26.029419 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2"] Jan 21 09:28:26 crc kubenswrapper[4618]: I0121 09:28:26.186736 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" event={"ID":"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde","Type":"ContainerStarted","Data":"11f0637610ea83487c8f1718dc58ccf6a178999c47bdd0384dc0bedf71ef3af6"} Jan 21 09:28:27 crc kubenswrapper[4618]: I0121 09:28:27.195218 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" event={"ID":"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde","Type":"ContainerStarted","Data":"dca3f0878ea32d8476ad97a9e4f96faaa0a2b473f0b5a4331f46d04bd688f0a3"} Jan 21 09:28:27 crc kubenswrapper[4618]: I0121 09:28:27.215834 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" podStartSLOduration=1.661034464 podStartE2EDuration="2.215816891s" podCreationTimestamp="2026-01-21 09:28:25 +0000 UTC" firstStartedPulling="2026-01-21 09:28:26.031012976 +0000 UTC m=+1504.781480294" lastFinishedPulling="2026-01-21 09:28:26.585795404 +0000 UTC m=+1505.336262721" observedRunningTime="2026-01-21 09:28:27.206867536 +0000 UTC m=+1505.957334853" watchObservedRunningTime="2026-01-21 09:28:27.215816891 +0000 UTC m=+1505.966284208" Jan 21 09:28:27 crc kubenswrapper[4618]: I0121 09:28:27.837650 4618 scope.go:117] "RemoveContainer" containerID="42ab8ca3f7c184a911cc0c2f7df404d8872facf10b1646eb88069c91f79c5f48" Jan 21 09:28:27 crc kubenswrapper[4618]: I0121 09:28:27.876195 4618 scope.go:117] "RemoveContainer" containerID="58bc2e013ab44651fb62f3d974e744d7e6980163e2de14a29cf610864a23d4cd" Jan 21 09:28:27 crc 
kubenswrapper[4618]: I0121 09:28:27.892603 4618 scope.go:117] "RemoveContainer" containerID="010b11d67fefde8e399456132c1f94a50b6b261395e9da406debcaf9e7b310e2" Jan 21 09:28:27 crc kubenswrapper[4618]: I0121 09:28:27.924756 4618 scope.go:117] "RemoveContainer" containerID="ede8bfe8c1f35eda989d6db1469c3ab0c187fbfcdfac2df094785949c48196e3" Jan 21 09:28:27 crc kubenswrapper[4618]: I0121 09:28:27.972816 4618 scope.go:117] "RemoveContainer" containerID="8b76966ef840de994e421729c0767fcf908c1cf8cd0b678fdea4b8d8971b5fb4" Jan 21 09:28:28 crc kubenswrapper[4618]: I0121 09:28:28.007377 4618 scope.go:117] "RemoveContainer" containerID="0d433ae38db2df7bbdad947799db2a7da6290b91bedcf10ead42729f9839d023" Jan 21 09:28:28 crc kubenswrapper[4618]: I0121 09:28:28.023034 4618 scope.go:117] "RemoveContainer" containerID="b0865635365d32c9c1fe6743f6d8d6b1f3944c3e71f87627a5b0e6f650bd393a" Jan 21 09:28:28 crc kubenswrapper[4618]: I0121 09:28:28.053110 4618 scope.go:117] "RemoveContainer" containerID="ff76ee5bc7affd2950b673c6ee148d97c0543b6ac93970aedab3ced0c91a3005" Jan 21 09:28:28 crc kubenswrapper[4618]: I0121 09:28:28.069245 4618 scope.go:117] "RemoveContainer" containerID="351aa67e75091cda85e1f589ab12fe5602e0ddd1793e747e581af3b2c01baee2" Jan 21 09:28:28 crc kubenswrapper[4618]: I0121 09:28:28.092504 4618 scope.go:117] "RemoveContainer" containerID="e94ac90fbc6029d8088933b785aa87f1603f41efe025bde503fa963ec1a850df" Jan 21 09:28:38 crc kubenswrapper[4618]: I0121 09:28:38.080194 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-l5lv6"] Jan 21 09:28:38 crc kubenswrapper[4618]: I0121 09:28:38.142055 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-q2tdw"] Jan 21 09:28:38 crc kubenswrapper[4618]: I0121 09:28:38.151042 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-l5lv6"] Jan 21 09:28:38 crc kubenswrapper[4618]: I0121 09:28:38.160546 4618 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-db-sync-q2tdw"] Jan 21 09:28:39 crc kubenswrapper[4618]: I0121 09:28:39.545604 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12353eaa-fb43-415a-b590-f69fadbdd4e1" path="/var/lib/kubelet/pods/12353eaa-fb43-415a-b590-f69fadbdd4e1/volumes" Jan 21 09:28:39 crc kubenswrapper[4618]: I0121 09:28:39.546398 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d13246f-0095-4316-9769-2173765b9ae6" path="/var/lib/kubelet/pods/8d13246f-0095-4316-9769-2173765b9ae6/volumes" Jan 21 09:28:42 crc kubenswrapper[4618]: I0121 09:28:42.040976 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wgbvc"] Jan 21 09:28:42 crc kubenswrapper[4618]: I0121 09:28:42.050202 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wgbvc"] Jan 21 09:28:43 crc kubenswrapper[4618]: I0121 09:28:43.546502 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f724147-bec3-4df8-8f5d-cb9ff9e128e0" path="/var/lib/kubelet/pods/1f724147-bec3-4df8-8f5d-cb9ff9e128e0/volumes" Jan 21 09:28:48 crc kubenswrapper[4618]: I0121 09:28:48.031724 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vs7mz"] Jan 21 09:28:48 crc kubenswrapper[4618]: I0121 09:28:48.036876 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vs7mz"] Jan 21 09:28:49 crc kubenswrapper[4618]: I0121 09:28:49.545709 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a5bcfd-6e6c-4070-b7e2-b2e90789f888" path="/var/lib/kubelet/pods/46a5bcfd-6e6c-4070-b7e2-b2e90789f888/volumes" Jan 21 09:28:52 crc kubenswrapper[4618]: I0121 09:28:52.404017 4618 generic.go:334] "Generic (PLEG): container finished" podID="3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" containerID="dca3f0878ea32d8476ad97a9e4f96faaa0a2b473f0b5a4331f46d04bd688f0a3" exitCode=0 Jan 21 09:28:52 crc kubenswrapper[4618]: I0121 
09:28:52.404083 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" event={"ID":"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde","Type":"ContainerDied","Data":"dca3f0878ea32d8476ad97a9e4f96faaa0a2b473f0b5a4331f46d04bd688f0a3"} Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.795303 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.942566 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8n4j\" (UniqueName: \"kubernetes.io/projected/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-kube-api-access-f8n4j\") pod \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.942796 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-inventory\") pod \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.942921 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-ssh-key-openstack-edpm-ipam\") pod \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\" (UID: \"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde\") " Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.948983 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-kube-api-access-f8n4j" (OuterVolumeSpecName: "kube-api-access-f8n4j") pod "3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" (UID: "3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde"). InnerVolumeSpecName "kube-api-access-f8n4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.966947 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-inventory" (OuterVolumeSpecName: "inventory") pod "3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" (UID: "3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:28:53 crc kubenswrapper[4618]: I0121 09:28:53.967506 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" (UID: "3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.045352 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.045382 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.045395 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8n4j\" (UniqueName: \"kubernetes.io/projected/3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde-kube-api-access-f8n4j\") on node \"crc\" DevicePath \"\"" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.419741 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" 
event={"ID":"3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde","Type":"ContainerDied","Data":"11f0637610ea83487c8f1718dc58ccf6a178999c47bdd0384dc0bedf71ef3af6"} Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.419782 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11f0637610ea83487c8f1718dc58ccf6a178999c47bdd0384dc0bedf71ef3af6" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.419783 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4b7z2" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.501579 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9"] Jan 21 09:28:54 crc kubenswrapper[4618]: E0121 09:28:54.502001 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.502022 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.502277 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.502867 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.507774 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.507838 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.508136 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.509441 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.515683 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9"] Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.655596 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.655715 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgz99\" (UniqueName: \"kubernetes.io/projected/3d61b58a-5231-47ee-8d01-2eb51a1def0c-kube-api-access-fgz99\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.656972 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.759093 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.759271 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.759304 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgz99\" (UniqueName: \"kubernetes.io/projected/3d61b58a-5231-47ee-8d01-2eb51a1def0c-kube-api-access-fgz99\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.763921 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.764297 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.778097 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgz99\" (UniqueName: \"kubernetes.io/projected/3d61b58a-5231-47ee-8d01-2eb51a1def0c-kube-api-access-fgz99\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:54 crc kubenswrapper[4618]: I0121 09:28:54.816434 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:28:55 crc kubenswrapper[4618]: I0121 09:28:55.301625 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9"] Jan 21 09:28:55 crc kubenswrapper[4618]: I0121 09:28:55.431663 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" event={"ID":"3d61b58a-5231-47ee-8d01-2eb51a1def0c","Type":"ContainerStarted","Data":"12d4293f2859a39008798cb08ac038f1b16e8e8bfde8e95e318eaaff3c3186ed"} Jan 21 09:28:56 crc kubenswrapper[4618]: I0121 09:28:56.439378 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" event={"ID":"3d61b58a-5231-47ee-8d01-2eb51a1def0c","Type":"ContainerStarted","Data":"78b962d9d6ba9d9132a18b40e4105059e34fbdc4a957c02be3fb979601ea62be"} Jan 21 09:28:56 crc kubenswrapper[4618]: I0121 09:28:56.453214 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" podStartSLOduration=1.956565272 podStartE2EDuration="2.453194201s" podCreationTimestamp="2026-01-21 09:28:54 +0000 UTC" firstStartedPulling="2026-01-21 09:28:55.305773718 +0000 UTC m=+1534.056241035" lastFinishedPulling="2026-01-21 09:28:55.802402647 +0000 UTC m=+1534.552869964" observedRunningTime="2026-01-21 09:28:56.452283018 +0000 UTC m=+1535.202750335" watchObservedRunningTime="2026-01-21 09:28:56.453194201 +0000 UTC m=+1535.203661518" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.785224 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fh5rd"] Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.787856 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.795423 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fh5rd"] Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.828019 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2ch\" (UniqueName: \"kubernetes.io/projected/7df0b108-558d-436b-8834-de4e5fbbdd68-kube-api-access-sh2ch\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.828094 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-catalog-content\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.828175 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-utilities\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.930770 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-utilities\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.931173 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sh2ch\" (UniqueName: \"kubernetes.io/projected/7df0b108-558d-436b-8834-de4e5fbbdd68-kube-api-access-sh2ch\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.931408 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-catalog-content\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.931922 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-catalog-content\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.932535 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-utilities\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:07 crc kubenswrapper[4618]: I0121 09:29:07.952862 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2ch\" (UniqueName: \"kubernetes.io/projected/7df0b108-558d-436b-8834-de4e5fbbdd68-kube-api-access-sh2ch\") pod \"redhat-operators-fh5rd\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:08 crc kubenswrapper[4618]: I0121 09:29:08.112475 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:08 crc kubenswrapper[4618]: I0121 09:29:08.509061 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fh5rd"] Jan 21 09:29:08 crc kubenswrapper[4618]: I0121 09:29:08.538537 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerStarted","Data":"388645150b6d1d02f691b0a93f43adfdc8013e6f4a7714dac5bee85604f4fefe"} Jan 21 09:29:09 crc kubenswrapper[4618]: I0121 09:29:09.549965 4618 generic.go:334] "Generic (PLEG): container finished" podID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerID="b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275" exitCode=0 Jan 21 09:29:09 crc kubenswrapper[4618]: I0121 09:29:09.550027 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerDied","Data":"b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275"} Jan 21 09:29:11 crc kubenswrapper[4618]: I0121 09:29:11.572042 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerStarted","Data":"3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57"} Jan 21 09:29:12 crc kubenswrapper[4618]: I0121 09:29:12.588902 4618 generic.go:334] "Generic (PLEG): container finished" podID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerID="3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57" exitCode=0 Jan 21 09:29:12 crc kubenswrapper[4618]: I0121 09:29:12.588966 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" 
event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerDied","Data":"3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57"} Jan 21 09:29:13 crc kubenswrapper[4618]: I0121 09:29:13.598224 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerStarted","Data":"4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4"} Jan 21 09:29:13 crc kubenswrapper[4618]: I0121 09:29:13.619288 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fh5rd" podStartSLOduration=3.022471106 podStartE2EDuration="6.619274055s" podCreationTimestamp="2026-01-21 09:29:07 +0000 UTC" firstStartedPulling="2026-01-21 09:29:09.551901751 +0000 UTC m=+1548.302369067" lastFinishedPulling="2026-01-21 09:29:13.148704698 +0000 UTC m=+1551.899172016" observedRunningTime="2026-01-21 09:29:13.611100159 +0000 UTC m=+1552.361567476" watchObservedRunningTime="2026-01-21 09:29:13.619274055 +0000 UTC m=+1552.369741363" Jan 21 09:29:18 crc kubenswrapper[4618]: I0121 09:29:18.113522 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:18 crc kubenswrapper[4618]: I0121 09:29:18.114048 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:19 crc kubenswrapper[4618]: I0121 09:29:19.151463 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fh5rd" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="registry-server" probeResult="failure" output=< Jan 21 09:29:19 crc kubenswrapper[4618]: timeout: failed to connect service ":50051" within 1s Jan 21 09:29:19 crc kubenswrapper[4618]: > Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.144251 4618 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.186833 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.276187 4618 scope.go:117] "RemoveContainer" containerID="87a40136d868c2824f5fc5bb46b743424c7a3dbf9f6bc49e200f2795c790d5dc" Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.300750 4618 scope.go:117] "RemoveContainer" containerID="31e4d95e7e1ccdff0b91bc94a4b7eb6d8ccb2a874e1197bcf9ba5a422bfbe3b1" Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.349447 4618 scope.go:117] "RemoveContainer" containerID="a398c0974e3acf51dc424a7d06423325756541e8a3ba87270615b3e91608837f" Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.380633 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fh5rd"] Jan 21 09:29:28 crc kubenswrapper[4618]: I0121 09:29:28.401863 4618 scope.go:117] "RemoveContainer" containerID="b4903311de2e0d5517b3770d98e9bfaad26bd69150b6b3dd9d488ddf1d2a52df" Jan 21 09:29:29 crc kubenswrapper[4618]: I0121 09:29:29.749935 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fh5rd" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="registry-server" containerID="cri-o://4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4" gracePeriod=2 Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.160652 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.322400 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-catalog-content\") pod \"7df0b108-558d-436b-8834-de4e5fbbdd68\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.322494 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh2ch\" (UniqueName: \"kubernetes.io/projected/7df0b108-558d-436b-8834-de4e5fbbdd68-kube-api-access-sh2ch\") pod \"7df0b108-558d-436b-8834-de4e5fbbdd68\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.322689 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-utilities\") pod \"7df0b108-558d-436b-8834-de4e5fbbdd68\" (UID: \"7df0b108-558d-436b-8834-de4e5fbbdd68\") " Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.324186 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-utilities" (OuterVolumeSpecName: "utilities") pod "7df0b108-558d-436b-8834-de4e5fbbdd68" (UID: "7df0b108-558d-436b-8834-de4e5fbbdd68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.345878 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df0b108-558d-436b-8834-de4e5fbbdd68-kube-api-access-sh2ch" (OuterVolumeSpecName: "kube-api-access-sh2ch") pod "7df0b108-558d-436b-8834-de4e5fbbdd68" (UID: "7df0b108-558d-436b-8834-de4e5fbbdd68"). InnerVolumeSpecName "kube-api-access-sh2ch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.425295 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.425323 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh2ch\" (UniqueName: \"kubernetes.io/projected/7df0b108-558d-436b-8834-de4e5fbbdd68-kube-api-access-sh2ch\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.441582 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7df0b108-558d-436b-8834-de4e5fbbdd68" (UID: "7df0b108-558d-436b-8834-de4e5fbbdd68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.526917 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df0b108-558d-436b-8834-de4e5fbbdd68-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.764219 4618 generic.go:334] "Generic (PLEG): container finished" podID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerID="4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4" exitCode=0 Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.764291 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerDied","Data":"4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4"} Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.764323 4618 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fh5rd" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.764717 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fh5rd" event={"ID":"7df0b108-558d-436b-8834-de4e5fbbdd68","Type":"ContainerDied","Data":"388645150b6d1d02f691b0a93f43adfdc8013e6f4a7714dac5bee85604f4fefe"} Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.764766 4618 scope.go:117] "RemoveContainer" containerID="4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.767707 4618 generic.go:334] "Generic (PLEG): container finished" podID="3d61b58a-5231-47ee-8d01-2eb51a1def0c" containerID="78b962d9d6ba9d9132a18b40e4105059e34fbdc4a957c02be3fb979601ea62be" exitCode=0 Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.767736 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" event={"ID":"3d61b58a-5231-47ee-8d01-2eb51a1def0c","Type":"ContainerDied","Data":"78b962d9d6ba9d9132a18b40e4105059e34fbdc4a957c02be3fb979601ea62be"} Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.789282 4618 scope.go:117] "RemoveContainer" containerID="3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.808509 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fh5rd"] Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.810905 4618 scope.go:117] "RemoveContainer" containerID="b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.816754 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fh5rd"] Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.836133 4618 scope.go:117] "RemoveContainer" 
containerID="4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4" Jan 21 09:29:30 crc kubenswrapper[4618]: E0121 09:29:30.836504 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4\": container with ID starting with 4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4 not found: ID does not exist" containerID="4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.836546 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4"} err="failed to get container status \"4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4\": rpc error: code = NotFound desc = could not find container \"4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4\": container with ID starting with 4d0206b4987a6875abcc0fe434b94c98ab0abb658aeb70bbe4a8238d5f413ca4 not found: ID does not exist" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.836574 4618 scope.go:117] "RemoveContainer" containerID="3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57" Jan 21 09:29:30 crc kubenswrapper[4618]: E0121 09:29:30.836986 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57\": container with ID starting with 3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57 not found: ID does not exist" containerID="3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.837007 4618 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57"} err="failed to get container status \"3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57\": rpc error: code = NotFound desc = could not find container \"3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57\": container with ID starting with 3a301562349a84fdb3a96ff7fdef10b8e167962d2f2dcbe548dccf19884e1a57 not found: ID does not exist" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.837020 4618 scope.go:117] "RemoveContainer" containerID="b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275" Jan 21 09:29:30 crc kubenswrapper[4618]: E0121 09:29:30.837402 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275\": container with ID starting with b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275 not found: ID does not exist" containerID="b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275" Jan 21 09:29:30 crc kubenswrapper[4618]: I0121 09:29:30.837451 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275"} err="failed to get container status \"b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275\": rpc error: code = NotFound desc = could not find container \"b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275\": container with ID starting with b4c9c1eb80753d92029790bed0386ed8c3884e6529f8b70c7609af9535392275 not found: ID does not exist" Jan 21 09:29:31 crc kubenswrapper[4618]: I0121 09:29:31.546829 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" path="/var/lib/kubelet/pods/7df0b108-558d-436b-8834-de4e5fbbdd68/volumes" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 
09:29:32.157119 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.166162 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgz99\" (UniqueName: \"kubernetes.io/projected/3d61b58a-5231-47ee-8d01-2eb51a1def0c-kube-api-access-fgz99\") pod \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.166246 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-inventory\") pod \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.178864 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d61b58a-5231-47ee-8d01-2eb51a1def0c-kube-api-access-fgz99" (OuterVolumeSpecName: "kube-api-access-fgz99") pod "3d61b58a-5231-47ee-8d01-2eb51a1def0c" (UID: "3d61b58a-5231-47ee-8d01-2eb51a1def0c"). InnerVolumeSpecName "kube-api-access-fgz99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.197654 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-inventory" (OuterVolumeSpecName: "inventory") pod "3d61b58a-5231-47ee-8d01-2eb51a1def0c" (UID: "3d61b58a-5231-47ee-8d01-2eb51a1def0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.267808 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-ssh-key-openstack-edpm-ipam\") pod \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\" (UID: \"3d61b58a-5231-47ee-8d01-2eb51a1def0c\") " Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.268337 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.268359 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgz99\" (UniqueName: \"kubernetes.io/projected/3d61b58a-5231-47ee-8d01-2eb51a1def0c-kube-api-access-fgz99\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.289789 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d61b58a-5231-47ee-8d01-2eb51a1def0c" (UID: "3d61b58a-5231-47ee-8d01-2eb51a1def0c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.370376 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d61b58a-5231-47ee-8d01-2eb51a1def0c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.792659 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" event={"ID":"3d61b58a-5231-47ee-8d01-2eb51a1def0c","Type":"ContainerDied","Data":"12d4293f2859a39008798cb08ac038f1b16e8e8bfde8e95e318eaaff3c3186ed"} Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.792736 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d4293f2859a39008798cb08ac038f1b16e8e8bfde8e95e318eaaff3c3186ed" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.792818 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.851408 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bkf9g"] Jan 21 09:29:32 crc kubenswrapper[4618]: E0121 09:29:32.851912 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="registry-server" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.851943 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="registry-server" Jan 21 09:29:32 crc kubenswrapper[4618]: E0121 09:29:32.851955 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d61b58a-5231-47ee-8d01-2eb51a1def0c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.851963 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d61b58a-5231-47ee-8d01-2eb51a1def0c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:29:32 crc kubenswrapper[4618]: E0121 09:29:32.851981 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="extract-utilities" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.851987 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="extract-utilities" Jan 21 09:29:32 crc kubenswrapper[4618]: E0121 09:29:32.851997 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="extract-content" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.852003 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="extract-content" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.852228 4618 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3d61b58a-5231-47ee-8d01-2eb51a1def0c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.852249 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df0b108-558d-436b-8834-de4e5fbbdd68" containerName="registry-server" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.852988 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.856274 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.856530 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.856570 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.856679 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.860624 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bkf9g"] Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.880399 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.880547 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.880651 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6fd\" (UniqueName: \"kubernetes.io/projected/aed2c63b-6043-45ca-90ac-b445dc0112fe-kube-api-access-qs6fd\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.981722 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.981793 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6fd\" (UniqueName: \"kubernetes.io/projected/aed2c63b-6043-45ca-90ac-b445dc0112fe-kube-api-access-qs6fd\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.981907 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.985825 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.986127 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:32 crc kubenswrapper[4618]: I0121 09:29:32.998360 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6fd\" (UniqueName: \"kubernetes.io/projected/aed2c63b-6043-45ca-90ac-b445dc0112fe-kube-api-access-qs6fd\") pod \"ssh-known-hosts-edpm-deployment-bkf9g\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:33 crc kubenswrapper[4618]: I0121 09:29:33.168716 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:33 crc kubenswrapper[4618]: I0121 09:29:33.664857 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bkf9g"] Jan 21 09:29:33 crc kubenswrapper[4618]: I0121 09:29:33.804782 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" event={"ID":"aed2c63b-6043-45ca-90ac-b445dc0112fe","Type":"ContainerStarted","Data":"92fbea033fc6220e85db518b9a7d5614c030f295c48088fd0bfaf365a758a398"} Jan 21 09:29:34 crc kubenswrapper[4618]: I0121 09:29:34.817979 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" event={"ID":"aed2c63b-6043-45ca-90ac-b445dc0112fe","Type":"ContainerStarted","Data":"44af7e50ffd9254db17894845b966ae22258633bdf352c59d4341ed172ca97db"} Jan 21 09:29:34 crc kubenswrapper[4618]: I0121 09:29:34.836916 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" podStartSLOduration=2.190427347 podStartE2EDuration="2.836886477s" podCreationTimestamp="2026-01-21 09:29:32 +0000 UTC" firstStartedPulling="2026-01-21 09:29:33.660797718 +0000 UTC m=+1572.411265035" lastFinishedPulling="2026-01-21 09:29:34.307256848 +0000 UTC m=+1573.057724165" observedRunningTime="2026-01-21 09:29:34.832231056 +0000 UTC m=+1573.582698393" watchObservedRunningTime="2026-01-21 09:29:34.836886477 +0000 UTC m=+1573.587353794" Jan 21 09:29:39 crc kubenswrapper[4618]: I0121 09:29:39.864306 4618 generic.go:334] "Generic (PLEG): container finished" podID="aed2c63b-6043-45ca-90ac-b445dc0112fe" containerID="44af7e50ffd9254db17894845b966ae22258633bdf352c59d4341ed172ca97db" exitCode=0 Jan 21 09:29:39 crc kubenswrapper[4618]: I0121 09:29:39.864371 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" 
event={"ID":"aed2c63b-6043-45ca-90ac-b445dc0112fe","Type":"ContainerDied","Data":"44af7e50ffd9254db17894845b966ae22258633bdf352c59d4341ed172ca97db"} Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.047528 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-k47ps"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.059230 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wvhr2"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.076606 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-63df-account-create-update-sssqx"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.085063 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-k47ps"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.092920 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-c9fb5"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.099640 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wvhr2"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.104955 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d612-account-create-update-6whq9"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.110038 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9f33-account-create-update-nhlhp"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.114876 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-63df-account-create-update-sssqx"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.119765 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-c9fb5"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.124462 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-d612-account-create-update-6whq9"] Jan 21 09:29:40 crc kubenswrapper[4618]: I0121 09:29:40.129178 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9f33-account-create-update-nhlhp"] Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.193170 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.270978 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs6fd\" (UniqueName: \"kubernetes.io/projected/aed2c63b-6043-45ca-90ac-b445dc0112fe-kube-api-access-qs6fd\") pod \"aed2c63b-6043-45ca-90ac-b445dc0112fe\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.271242 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-ssh-key-openstack-edpm-ipam\") pod \"aed2c63b-6043-45ca-90ac-b445dc0112fe\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.271281 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-inventory-0\") pod \"aed2c63b-6043-45ca-90ac-b445dc0112fe\" (UID: \"aed2c63b-6043-45ca-90ac-b445dc0112fe\") " Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.276818 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed2c63b-6043-45ca-90ac-b445dc0112fe-kube-api-access-qs6fd" (OuterVolumeSpecName: "kube-api-access-qs6fd") pod "aed2c63b-6043-45ca-90ac-b445dc0112fe" (UID: "aed2c63b-6043-45ca-90ac-b445dc0112fe"). InnerVolumeSpecName "kube-api-access-qs6fd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.292629 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aed2c63b-6043-45ca-90ac-b445dc0112fe" (UID: "aed2c63b-6043-45ca-90ac-b445dc0112fe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.299934 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "aed2c63b-6043-45ca-90ac-b445dc0112fe" (UID: "aed2c63b-6043-45ca-90ac-b445dc0112fe"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.375451 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.375490 4618 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aed2c63b-6043-45ca-90ac-b445dc0112fe-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.375501 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs6fd\" (UniqueName: \"kubernetes.io/projected/aed2c63b-6043-45ca-90ac-b445dc0112fe-kube-api-access-qs6fd\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.551569 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c89387-da4a-4f04-ae29-8acf81d8c18f" 
path="/var/lib/kubelet/pods/03c89387-da4a-4f04-ae29-8acf81d8c18f/volumes" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.552572 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9602ab-6ad6-4b2a-b142-5b646433ed19" path="/var/lib/kubelet/pods/1a9602ab-6ad6-4b2a-b142-5b646433ed19/volumes" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.553176 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6" path="/var/lib/kubelet/pods/86b5bcc6-7694-4b4b-b68d-44eaf7b90eb6/volumes" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.553691 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2cf19dc-4223-41c2-a849-d02a34917dad" path="/var/lib/kubelet/pods/b2cf19dc-4223-41c2-a849-d02a34917dad/volumes" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.554647 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b" path="/var/lib/kubelet/pods/bd7d4e53-e9cb-4e0d-b7c3-d89a441dbe4b/volumes" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.555196 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06dae48-dad2-4fc4-8083-5a83b1cb6eb7" path="/var/lib/kubelet/pods/c06dae48-dad2-4fc4-8083-5a83b1cb6eb7/volumes" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.885967 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" event={"ID":"aed2c63b-6043-45ca-90ac-b445dc0112fe","Type":"ContainerDied","Data":"92fbea033fc6220e85db518b9a7d5614c030f295c48088fd0bfaf365a758a398"} Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.886045 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92fbea033fc6220e85db518b9a7d5614c030f295c48088fd0bfaf365a758a398" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.886067 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bkf9g" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.934557 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr"] Jan 21 09:29:41 crc kubenswrapper[4618]: E0121 09:29:41.934979 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed2c63b-6043-45ca-90ac-b445dc0112fe" containerName="ssh-known-hosts-edpm-deployment" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.934997 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed2c63b-6043-45ca-90ac-b445dc0112fe" containerName="ssh-known-hosts-edpm-deployment" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.935193 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed2c63b-6043-45ca-90ac-b445dc0112fe" containerName="ssh-known-hosts-edpm-deployment" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.935792 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.940708 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.941211 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.941381 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.941503 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.944469 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr"] Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.988689 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.988837 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2t44\" (UniqueName: \"kubernetes.io/projected/1ce37433-9d98-4388-8374-b3a26afdd1c3-kube-api-access-m2t44\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:41 crc kubenswrapper[4618]: I0121 09:29:41.988875 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.091405 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2t44\" (UniqueName: \"kubernetes.io/projected/1ce37433-9d98-4388-8374-b3a26afdd1c3-kube-api-access-m2t44\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.091468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.091560 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.097199 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: 
\"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.097206 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.107262 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2t44\" (UniqueName: \"kubernetes.io/projected/1ce37433-9d98-4388-8374-b3a26afdd1c3-kube-api-access-m2t44\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s4dgr\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.253494 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.744917 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr"] Jan 21 09:29:42 crc kubenswrapper[4618]: I0121 09:29:42.894614 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" event={"ID":"1ce37433-9d98-4388-8374-b3a26afdd1c3","Type":"ContainerStarted","Data":"45c006a4ebfabae1829709ee3fa521919ab7dab4bfcf5d95a6722702abf45401"} Jan 21 09:29:43 crc kubenswrapper[4618]: I0121 09:29:43.904376 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" event={"ID":"1ce37433-9d98-4388-8374-b3a26afdd1c3","Type":"ContainerStarted","Data":"51802832d25832440342b7e71f7623e2764fe30c71e203e806b6e101276e42a2"} Jan 21 09:29:43 crc kubenswrapper[4618]: I0121 09:29:43.919511 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" podStartSLOduration=2.459195696 podStartE2EDuration="2.919499645s" podCreationTimestamp="2026-01-21 09:29:41 +0000 UTC" firstStartedPulling="2026-01-21 09:29:42.749154092 +0000 UTC m=+1581.499621409" lastFinishedPulling="2026-01-21 09:29:43.209458041 +0000 UTC m=+1581.959925358" observedRunningTime="2026-01-21 09:29:43.918287897 +0000 UTC m=+1582.668755214" watchObservedRunningTime="2026-01-21 09:29:43.919499645 +0000 UTC m=+1582.669966963" Jan 21 09:29:49 crc kubenswrapper[4618]: I0121 09:29:49.959874 4618 generic.go:334] "Generic (PLEG): container finished" podID="1ce37433-9d98-4388-8374-b3a26afdd1c3" containerID="51802832d25832440342b7e71f7623e2764fe30c71e203e806b6e101276e42a2" exitCode=0 Jan 21 09:29:49 crc kubenswrapper[4618]: I0121 09:29:49.959986 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" event={"ID":"1ce37433-9d98-4388-8374-b3a26afdd1c3","Type":"ContainerDied","Data":"51802832d25832440342b7e71f7623e2764fe30c71e203e806b6e101276e42a2"} Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.276474 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.291914 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-inventory\") pod \"1ce37433-9d98-4388-8374-b3a26afdd1c3\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.292036 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2t44\" (UniqueName: \"kubernetes.io/projected/1ce37433-9d98-4388-8374-b3a26afdd1c3-kube-api-access-m2t44\") pod \"1ce37433-9d98-4388-8374-b3a26afdd1c3\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.298092 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce37433-9d98-4388-8374-b3a26afdd1c3-kube-api-access-m2t44" (OuterVolumeSpecName: "kube-api-access-m2t44") pod "1ce37433-9d98-4388-8374-b3a26afdd1c3" (UID: "1ce37433-9d98-4388-8374-b3a26afdd1c3"). InnerVolumeSpecName "kube-api-access-m2t44". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.317666 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-inventory" (OuterVolumeSpecName: "inventory") pod "1ce37433-9d98-4388-8374-b3a26afdd1c3" (UID: "1ce37433-9d98-4388-8374-b3a26afdd1c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.393447 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-ssh-key-openstack-edpm-ipam\") pod \"1ce37433-9d98-4388-8374-b3a26afdd1c3\" (UID: \"1ce37433-9d98-4388-8374-b3a26afdd1c3\") " Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.394248 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.394269 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2t44\" (UniqueName: \"kubernetes.io/projected/1ce37433-9d98-4388-8374-b3a26afdd1c3-kube-api-access-m2t44\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.413277 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1ce37433-9d98-4388-8374-b3a26afdd1c3" (UID: "1ce37433-9d98-4388-8374-b3a26afdd1c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.496767 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ce37433-9d98-4388-8374-b3a26afdd1c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.982512 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" event={"ID":"1ce37433-9d98-4388-8374-b3a26afdd1c3","Type":"ContainerDied","Data":"45c006a4ebfabae1829709ee3fa521919ab7dab4bfcf5d95a6722702abf45401"} Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.982552 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c006a4ebfabae1829709ee3fa521919ab7dab4bfcf5d95a6722702abf45401" Jan 21 09:29:51 crc kubenswrapper[4618]: I0121 09:29:51.982894 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s4dgr" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.032837 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn"] Jan 21 09:29:52 crc kubenswrapper[4618]: E0121 09:29:52.033516 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce37433-9d98-4388-8374-b3a26afdd1c3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.033604 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce37433-9d98-4388-8374-b3a26afdd1c3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.033889 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce37433-9d98-4388-8374-b3a26afdd1c3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.034740 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.036707 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.036795 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.037037 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.037123 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.039559 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn"] Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.115016 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xt8l\" (UniqueName: \"kubernetes.io/projected/182b5ccb-0f34-47f6-b087-ceed41764dc6-kube-api-access-7xt8l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.115173 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.115384 4618 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.217512 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xt8l\" (UniqueName: \"kubernetes.io/projected/182b5ccb-0f34-47f6-b087-ceed41764dc6-kube-api-access-7xt8l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.218001 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.218171 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.222886 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.223118 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.233598 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xt8l\" (UniqueName: \"kubernetes.io/projected/182b5ccb-0f34-47f6-b087-ceed41764dc6-kube-api-access-7xt8l\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.355749 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.866798 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn"] Jan 21 09:29:52 crc kubenswrapper[4618]: I0121 09:29:52.993586 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" event={"ID":"182b5ccb-0f34-47f6-b087-ceed41764dc6","Type":"ContainerStarted","Data":"556e535469f72822ba8db794ad89385de7b1e3f65dd96abd2616841d0f3ed3d5"} Jan 21 09:29:54 crc kubenswrapper[4618]: I0121 09:29:54.003246 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" event={"ID":"182b5ccb-0f34-47f6-b087-ceed41764dc6","Type":"ContainerStarted","Data":"667740925f2fc90f9ea441f392e3b6826085d6441086c4822160c39d5d130c3c"} Jan 21 09:29:54 crc kubenswrapper[4618]: I0121 09:29:54.020043 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" podStartSLOduration=1.509983485 podStartE2EDuration="2.020016244s" podCreationTimestamp="2026-01-21 09:29:52 +0000 UTC" firstStartedPulling="2026-01-21 09:29:52.868569021 +0000 UTC m=+1591.619036337" lastFinishedPulling="2026-01-21 09:29:53.378601779 +0000 UTC m=+1592.129069096" observedRunningTime="2026-01-21 09:29:54.016878305 +0000 UTC m=+1592.767345622" watchObservedRunningTime="2026-01-21 09:29:54.020016244 +0000 UTC m=+1592.770483562" Jan 21 09:29:57 crc kubenswrapper[4618]: I0121 09:29:57.036360 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-th4n8"] Jan 21 09:29:57 crc kubenswrapper[4618]: I0121 09:29:57.044519 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-th4n8"] Jan 21 09:29:57 crc kubenswrapper[4618]: I0121 
09:29:57.559959 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc" path="/var/lib/kubelet/pods/1d3d0298-3dc5-4e2a-8f7a-6bf7a11bb1fc/volumes" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.139821 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq"] Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.142520 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.146006 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.146378 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.166585 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq"] Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.200947 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-config-volume\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.201014 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsgq\" (UniqueName: \"kubernetes.io/projected/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-kube-api-access-rtsgq\") pod \"collect-profiles-29483130-kf6jq\" (UID: 
\"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.201163 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-secret-volume\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.303799 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-secret-volume\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.303947 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-config-volume\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.303983 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsgq\" (UniqueName: \"kubernetes.io/projected/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-kube-api-access-rtsgq\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.304775 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-config-volume\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.317062 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-secret-volume\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.319645 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsgq\" (UniqueName: \"kubernetes.io/projected/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-kube-api-access-rtsgq\") pod \"collect-profiles-29483130-kf6jq\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.471049 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" Jan 21 09:30:00 crc kubenswrapper[4618]: I0121 09:30:00.869168 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq"] Jan 21 09:30:01 crc kubenswrapper[4618]: I0121 09:30:01.064548 4618 generic.go:334] "Generic (PLEG): container finished" podID="182b5ccb-0f34-47f6-b087-ceed41764dc6" containerID="667740925f2fc90f9ea441f392e3b6826085d6441086c4822160c39d5d130c3c" exitCode=0 Jan 21 09:30:01 crc kubenswrapper[4618]: I0121 09:30:01.064626 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" event={"ID":"182b5ccb-0f34-47f6-b087-ceed41764dc6","Type":"ContainerDied","Data":"667740925f2fc90f9ea441f392e3b6826085d6441086c4822160c39d5d130c3c"} Jan 21 09:30:01 crc kubenswrapper[4618]: I0121 09:30:01.066480 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" event={"ID":"81ad4c2e-2c01-49ee-97c9-689ff3fe3222","Type":"ContainerStarted","Data":"a534e012fd1a558b1c503abc4f06b1326defab6a3961a13cc3f24ed8b98ac538"} Jan 21 09:30:01 crc kubenswrapper[4618]: I0121 09:30:01.066526 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" event={"ID":"81ad4c2e-2c01-49ee-97c9-689ff3fe3222","Type":"ContainerStarted","Data":"8c33edf6cf547c74deb2aaf5e98457753661a53b84a59b42f04a4f14b60a6720"} Jan 21 09:30:01 crc kubenswrapper[4618]: I0121 09:30:01.091973 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" podStartSLOduration=1.091958893 podStartE2EDuration="1.091958893s" podCreationTimestamp="2026-01-21 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-21 09:30:01.091562137 +0000 UTC m=+1599.842029455" watchObservedRunningTime="2026-01-21 09:30:01.091958893 +0000 UTC m=+1599.842426210" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.079868 4618 generic.go:334] "Generic (PLEG): container finished" podID="81ad4c2e-2c01-49ee-97c9-689ff3fe3222" containerID="a534e012fd1a558b1c503abc4f06b1326defab6a3961a13cc3f24ed8b98ac538" exitCode=0 Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.079972 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" event={"ID":"81ad4c2e-2c01-49ee-97c9-689ff3fe3222","Type":"ContainerDied","Data":"a534e012fd1a558b1c503abc4f06b1326defab6a3961a13cc3f24ed8b98ac538"} Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.417736 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.453028 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xt8l\" (UniqueName: \"kubernetes.io/projected/182b5ccb-0f34-47f6-b087-ceed41764dc6-kube-api-access-7xt8l\") pod \"182b5ccb-0f34-47f6-b087-ceed41764dc6\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.453178 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-ssh-key-openstack-edpm-ipam\") pod \"182b5ccb-0f34-47f6-b087-ceed41764dc6\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.453250 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-inventory\") pod 
\"182b5ccb-0f34-47f6-b087-ceed41764dc6\" (UID: \"182b5ccb-0f34-47f6-b087-ceed41764dc6\") " Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.460375 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182b5ccb-0f34-47f6-b087-ceed41764dc6-kube-api-access-7xt8l" (OuterVolumeSpecName: "kube-api-access-7xt8l") pod "182b5ccb-0f34-47f6-b087-ceed41764dc6" (UID: "182b5ccb-0f34-47f6-b087-ceed41764dc6"). InnerVolumeSpecName "kube-api-access-7xt8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.477835 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "182b5ccb-0f34-47f6-b087-ceed41764dc6" (UID: "182b5ccb-0f34-47f6-b087-ceed41764dc6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.484666 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-inventory" (OuterVolumeSpecName: "inventory") pod "182b5ccb-0f34-47f6-b087-ceed41764dc6" (UID: "182b5ccb-0f34-47f6-b087-ceed41764dc6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.555046 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xt8l\" (UniqueName: \"kubernetes.io/projected/182b5ccb-0f34-47f6-b087-ceed41764dc6-kube-api-access-7xt8l\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.555082 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:02 crc kubenswrapper[4618]: I0121 09:30:02.555093 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/182b5ccb-0f34-47f6-b087-ceed41764dc6-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.088118 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.088099 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn" event={"ID":"182b5ccb-0f34-47f6-b087-ceed41764dc6","Type":"ContainerDied","Data":"556e535469f72822ba8db794ad89385de7b1e3f65dd96abd2616841d0f3ed3d5"} Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.088727 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556e535469f72822ba8db794ad89385de7b1e3f65dd96abd2616841d0f3ed3d5" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.166693 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"] Jan 21 09:30:03 crc kubenswrapper[4618]: E0121 09:30:03.167259 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="182b5ccb-0f34-47f6-b087-ceed41764dc6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.167327 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="182b5ccb-0f34-47f6-b087-ceed41764dc6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.167571 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="182b5ccb-0f34-47f6-b087-ceed41764dc6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.169053 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.171112 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.171306 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.172246 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.173606 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.173664 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.173701 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.173966 4618 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.174159 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.176497 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"] Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.271722 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.271778 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.271813 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" Jan 21 09:30:03 
crc kubenswrapper[4618]: I0121 09:30:03.271837 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lbb\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-kube-api-access-99lbb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.271856 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.271882 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.271903 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272065 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272191 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272225 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272245 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272263 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272338 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.272360 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374453 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374496 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374563 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374590 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374619 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374645 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lbb\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-kube-api-access-99lbb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374662 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374689 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374710 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374744 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374769 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374791 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374808 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.374826 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.378840 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.379270 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.379340 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.379917 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.380111 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.380496 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.380857 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.381497 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.381672 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.381887 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.382055 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.382831 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.387404 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.390561 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lbb\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-kube-api-access-99lbb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-476wc\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.452068 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.476629 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-config-volume\") pod \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") "
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.477259 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-secret-volume\") pod \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") "
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.477570 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtsgq\" (UniqueName: \"kubernetes.io/projected/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-kube-api-access-rtsgq\") pod \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\" (UID: \"81ad4c2e-2c01-49ee-97c9-689ff3fe3222\") "
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.477626 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-config-volume" (OuterVolumeSpecName: "config-volume") pod "81ad4c2e-2c01-49ee-97c9-689ff3fe3222" (UID: "81ad4c2e-2c01-49ee-97c9-689ff3fe3222"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.478366 4618 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.482400 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.483386 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81ad4c2e-2c01-49ee-97c9-689ff3fe3222" (UID: "81ad4c2e-2c01-49ee-97c9-689ff3fe3222"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.483480 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-kube-api-access-rtsgq" (OuterVolumeSpecName: "kube-api-access-rtsgq") pod "81ad4c2e-2c01-49ee-97c9-689ff3fe3222" (UID: "81ad4c2e-2c01-49ee-97c9-689ff3fe3222"). InnerVolumeSpecName "kube-api-access-rtsgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.581203 4618 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.581255 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtsgq\" (UniqueName: \"kubernetes.io/projected/81ad4c2e-2c01-49ee-97c9-689ff3fe3222-kube-api-access-rtsgq\") on node \"crc\" DevicePath \"\""
Jan 21 09:30:03 crc kubenswrapper[4618]: I0121 09:30:03.953590 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc"]
Jan 21 09:30:03 crc kubenswrapper[4618]: W0121 09:30:03.953707 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefaee91b_ca19_44cc_b8b4_37f6bf34067a.slice/crio-25fc92f6586b24c2321da0e92b128ffe77552b36d662f3763cdf88d173452b6a WatchSource:0}: Error finding container 25fc92f6586b24c2321da0e92b128ffe77552b36d662f3763cdf88d173452b6a: Status 404 returned error can't find the container with id 25fc92f6586b24c2321da0e92b128ffe77552b36d662f3763cdf88d173452b6a
Jan 21 09:30:04 crc kubenswrapper[4618]: I0121 09:30:04.097841 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq"
Jan 21 09:30:04 crc kubenswrapper[4618]: I0121 09:30:04.097842 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483130-kf6jq" event={"ID":"81ad4c2e-2c01-49ee-97c9-689ff3fe3222","Type":"ContainerDied","Data":"8c33edf6cf547c74deb2aaf5e98457753661a53b84a59b42f04a4f14b60a6720"}
Jan 21 09:30:04 crc kubenswrapper[4618]: I0121 09:30:04.097934 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c33edf6cf547c74deb2aaf5e98457753661a53b84a59b42f04a4f14b60a6720"
Jan 21 09:30:04 crc kubenswrapper[4618]: I0121 09:30:04.099546 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" event={"ID":"efaee91b-ca19-44cc-b8b4-37f6bf34067a","Type":"ContainerStarted","Data":"25fc92f6586b24c2321da0e92b128ffe77552b36d662f3763cdf88d173452b6a"}
Jan 21 09:30:05 crc kubenswrapper[4618]: I0121 09:30:05.110702 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" event={"ID":"efaee91b-ca19-44cc-b8b4-37f6bf34067a","Type":"ContainerStarted","Data":"6197174d97062d5e5d2eade7d7ac66d3a5bb099fcf77281aa36b953ef47b3d29"}
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.119608 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" podStartSLOduration=3.564215533 podStartE2EDuration="4.119570056s" podCreationTimestamp="2026-01-21 09:30:03 +0000 UTC" firstStartedPulling="2026-01-21 09:30:03.956021133 +0000 UTC m=+1602.706488451" lastFinishedPulling="2026-01-21 09:30:04.511375656 +0000 UTC m=+1603.261842974" observedRunningTime="2026-01-21 09:30:05.140542926 +0000 UTC m=+1603.891010243" watchObservedRunningTime="2026-01-21 09:30:07.119570056 +0000 UTC m=+1605.870037372"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.128442 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-97vq7"]
Jan 21 09:30:07 crc kubenswrapper[4618]: E0121 09:30:07.128910 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ad4c2e-2c01-49ee-97c9-689ff3fe3222" containerName="collect-profiles"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.128933 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ad4c2e-2c01-49ee-97c9-689ff3fe3222" containerName="collect-profiles"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.129100 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ad4c2e-2c01-49ee-97c9-689ff3fe3222" containerName="collect-profiles"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.130342 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.146708 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-97vq7"]
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.160284 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-utilities\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.160359 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-catalog-content\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.160395 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jzmz\" (UniqueName: \"kubernetes.io/projected/261c4f02-14ba-4f25-8f82-a10da73a16f8-kube-api-access-5jzmz\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.261712 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-utilities\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.261794 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-catalog-content\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.261845 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jzmz\" (UniqueName: \"kubernetes.io/projected/261c4f02-14ba-4f25-8f82-a10da73a16f8-kube-api-access-5jzmz\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.262223 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-utilities\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.262300 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-catalog-content\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.282494 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jzmz\" (UniqueName: \"kubernetes.io/projected/261c4f02-14ba-4f25-8f82-a10da73a16f8-kube-api-access-5jzmz\") pod \"community-operators-97vq7\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") " pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.445014 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:07 crc kubenswrapper[4618]: I0121 09:30:07.882671 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-97vq7"]
Jan 21 09:30:07 crc kubenswrapper[4618]: W0121 09:30:07.884795 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261c4f02_14ba_4f25_8f82_a10da73a16f8.slice/crio-6aa4e94b51b2803a97e50a8a3b1856487bc0d94c439c495d46f13ea97ddfacc7 WatchSource:0}: Error finding container 6aa4e94b51b2803a97e50a8a3b1856487bc0d94c439c495d46f13ea97ddfacc7: Status 404 returned error can't find the container with id 6aa4e94b51b2803a97e50a8a3b1856487bc0d94c439c495d46f13ea97ddfacc7
Jan 21 09:30:08 crc kubenswrapper[4618]: I0121 09:30:08.134843 4618 generic.go:334] "Generic (PLEG): container finished" podID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerID="2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9" exitCode=0
Jan 21 09:30:08 crc kubenswrapper[4618]: I0121 09:30:08.134946 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerDied","Data":"2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9"}
Jan 21 09:30:08 crc kubenswrapper[4618]: I0121 09:30:08.135084 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerStarted","Data":"6aa4e94b51b2803a97e50a8a3b1856487bc0d94c439c495d46f13ea97ddfacc7"}
Jan 21 09:30:09 crc kubenswrapper[4618]: I0121 09:30:09.147534 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerStarted","Data":"029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326"}
Jan 21 09:30:10 crc kubenswrapper[4618]: I0121 09:30:10.159279 4618 generic.go:334] "Generic (PLEG): container finished" podID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerID="029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326" exitCode=0
Jan 21 09:30:10 crc kubenswrapper[4618]: I0121 09:30:10.159331 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerDied","Data":"029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326"}
Jan 21 09:30:11 crc kubenswrapper[4618]: I0121 09:30:11.169472 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerStarted","Data":"bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1"}
Jan 21 09:30:11 crc kubenswrapper[4618]: I0121 09:30:11.192356 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-97vq7" podStartSLOduration=1.6590933300000001 podStartE2EDuration="4.192340469s" podCreationTimestamp="2026-01-21 09:30:07 +0000 UTC" firstStartedPulling="2026-01-21 09:30:08.136813317 +0000 UTC m=+1606.887280634" lastFinishedPulling="2026-01-21 09:30:10.670060456 +0000 UTC m=+1609.420527773" observedRunningTime="2026-01-21 09:30:11.186070792 +0000 UTC m=+1609.936538109" watchObservedRunningTime="2026-01-21 09:30:11.192340469 +0000 UTC m=+1609.942807786"
Jan 21 09:30:12 crc kubenswrapper[4618]: I0121 09:30:12.024671 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mzrwt"]
Jan 21 09:30:12 crc kubenswrapper[4618]: I0121 09:30:12.031402 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mzrwt"]
Jan 21 09:30:13 crc kubenswrapper[4618]: I0121 09:30:13.025029 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gl84v"]
Jan 21 09:30:13 crc kubenswrapper[4618]: I0121 09:30:13.030455 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gl84v"]
Jan 21 09:30:13 crc kubenswrapper[4618]: I0121 09:30:13.548487 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180187c8-6cea-49da-86c8-b0709d20403d" path="/var/lib/kubelet/pods/180187c8-6cea-49da-86c8-b0709d20403d/volumes"
Jan 21 09:30:13 crc kubenswrapper[4618]: I0121 09:30:13.549549 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3fa4156-22b0-45b2-b300-e7a4b768964a" path="/var/lib/kubelet/pods/f3fa4156-22b0-45b2-b300-e7a4b768964a/volumes"
Jan 21 09:30:17 crc kubenswrapper[4618]: I0121 09:30:17.445168 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:17 crc kubenswrapper[4618]: I0121 09:30:17.445580 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:17 crc kubenswrapper[4618]: I0121 09:30:17.485562 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:18 crc kubenswrapper[4618]: I0121 09:30:18.267675 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:18 crc kubenswrapper[4618]: I0121 09:30:18.315475 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-97vq7"]
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.247430 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-97vq7" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="registry-server" containerID="cri-o://bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1" gracePeriod=2
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.663211 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97vq7"
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.760229 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-catalog-content\") pod \"261c4f02-14ba-4f25-8f82-a10da73a16f8\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") "
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.760432 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jzmz\" (UniqueName: \"kubernetes.io/projected/261c4f02-14ba-4f25-8f82-a10da73a16f8-kube-api-access-5jzmz\") pod \"261c4f02-14ba-4f25-8f82-a10da73a16f8\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") "
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.760478 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-utilities\") pod \"261c4f02-14ba-4f25-8f82-a10da73a16f8\" (UID: \"261c4f02-14ba-4f25-8f82-a10da73a16f8\") "
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.761342 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-utilities" (OuterVolumeSpecName: "utilities") pod "261c4f02-14ba-4f25-8f82-a10da73a16f8" (UID: "261c4f02-14ba-4f25-8f82-a10da73a16f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.767415 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261c4f02-14ba-4f25-8f82-a10da73a16f8-kube-api-access-5jzmz" (OuterVolumeSpecName: "kube-api-access-5jzmz") pod "261c4f02-14ba-4f25-8f82-a10da73a16f8" (UID: "261c4f02-14ba-4f25-8f82-a10da73a16f8"). InnerVolumeSpecName "kube-api-access-5jzmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.806632 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "261c4f02-14ba-4f25-8f82-a10da73a16f8" (UID: "261c4f02-14ba-4f25-8f82-a10da73a16f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.863696 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.863747 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jzmz\" (UniqueName: \"kubernetes.io/projected/261c4f02-14ba-4f25-8f82-a10da73a16f8-kube-api-access-5jzmz\") on node \"crc\" DevicePath \"\""
Jan 21 09:30:20 crc kubenswrapper[4618]: I0121 09:30:20.863763 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/261c4f02-14ba-4f25-8f82-a10da73a16f8-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.261041 4618 generic.go:334] "Generic (PLEG): container finished" podID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerID="bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1" exitCode=0
Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.261096 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerDied","Data":"bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1"}
Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.261134 4618 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-97vq7" event={"ID":"261c4f02-14ba-4f25-8f82-a10da73a16f8","Type":"ContainerDied","Data":"6aa4e94b51b2803a97e50a8a3b1856487bc0d94c439c495d46f13ea97ddfacc7"} Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.261166 4618 scope.go:117] "RemoveContainer" containerID="bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.261289 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97vq7" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.288360 4618 scope.go:117] "RemoveContainer" containerID="029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.294154 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-97vq7"] Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.299088 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-97vq7"] Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.309278 4618 scope.go:117] "RemoveContainer" containerID="2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.339832 4618 scope.go:117] "RemoveContainer" containerID="bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1" Jan 21 09:30:21 crc kubenswrapper[4618]: E0121 09:30:21.340259 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1\": container with ID starting with bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1 not found: ID does not exist" containerID="bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 
09:30:21.340287 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1"} err="failed to get container status \"bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1\": rpc error: code = NotFound desc = could not find container \"bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1\": container with ID starting with bddba825240b7634a3cc3f0e5dcebcd0a6de7a22616a9d31ab9bffa0c06082e1 not found: ID does not exist" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.340305 4618 scope.go:117] "RemoveContainer" containerID="029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326" Jan 21 09:30:21 crc kubenswrapper[4618]: E0121 09:30:21.340623 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326\": container with ID starting with 029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326 not found: ID does not exist" containerID="029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.340648 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326"} err="failed to get container status \"029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326\": rpc error: code = NotFound desc = could not find container \"029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326\": container with ID starting with 029c0c913790e5fe1b54a3b708dfc69291d7d8bdbeed686d3cc9d6d0f68e7326 not found: ID does not exist" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.340670 4618 scope.go:117] "RemoveContainer" containerID="2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9" Jan 21 09:30:21 crc 
kubenswrapper[4618]: E0121 09:30:21.341365 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9\": container with ID starting with 2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9 not found: ID does not exist" containerID="2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.341422 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9"} err="failed to get container status \"2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9\": rpc error: code = NotFound desc = could not find container \"2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9\": container with ID starting with 2bdac214590131f68bd5531de1b352f6e2da3346f72f134d3fe0b9246b13d6c9 not found: ID does not exist" Jan 21 09:30:21 crc kubenswrapper[4618]: I0121 09:30:21.549125 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" path="/var/lib/kubelet/pods/261c4f02-14ba-4f25-8f82-a10da73a16f8/volumes" Jan 21 09:30:26 crc kubenswrapper[4618]: I0121 09:30:26.958782 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:30:26 crc kubenswrapper[4618]: I0121 09:30:26.959415 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.530473 4618 scope.go:117] "RemoveContainer" containerID="c68d8e997c123e5fc8bccc47d13892131b96bbafa56501a9813c2fd85303e947" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.557691 4618 scope.go:117] "RemoveContainer" containerID="b79c442395a0ab91161432ef7ec7c224815b7560661146b28dfd2f0b167f7388" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.598429 4618 scope.go:117] "RemoveContainer" containerID="172bfda592ead090abe21a7d722410a375184f3b8ca826356e0689144a64e370" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.642041 4618 scope.go:117] "RemoveContainer" containerID="0abcf3420c84426b5c3032545f83baf2a312536ae71853c86b42c0c18c469ba2" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.660657 4618 scope.go:117] "RemoveContainer" containerID="68008014429ca04cf8d1c188b218e685929dcdba214684e5658bfd4718a27db8" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.695681 4618 scope.go:117] "RemoveContainer" containerID="4b2ac550cd82c2011c2aff516a4b46f15c57aa85c4fd4772bf7a789dfd4ea162" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.734273 4618 scope.go:117] "RemoveContainer" containerID="f6950cb02ef07e93504da2db387baf84efb0e5ade66a6dc20e96b3da9cea4692" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.755955 4618 scope.go:117] "RemoveContainer" containerID="d1cdfd5ad8454b9f8817f40bb774fc0ea3c8449535026c0c00aaf0aa372de33a" Jan 21 09:30:28 crc kubenswrapper[4618]: I0121 09:30:28.801190 4618 scope.go:117] "RemoveContainer" containerID="c926bae56d493038364720fdbe87a9f55b9377cd0c66f33d09c541c0cc66af11" Jan 21 09:30:31 crc kubenswrapper[4618]: I0121 09:30:31.350797 4618 generic.go:334] "Generic (PLEG): container finished" podID="efaee91b-ca19-44cc-b8b4-37f6bf34067a" containerID="6197174d97062d5e5d2eade7d7ac66d3a5bb099fcf77281aa36b953ef47b3d29" exitCode=0 Jan 21 09:30:31 crc kubenswrapper[4618]: I0121 09:30:31.350895 4618 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" event={"ID":"efaee91b-ca19-44cc-b8b4-37f6bf34067a","Type":"ContainerDied","Data":"6197174d97062d5e5d2eade7d7ac66d3a5bb099fcf77281aa36b953ef47b3d29"} Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.689534 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.798498 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-libvirt-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.798689 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-bootstrap-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.798784 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ovn-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.798875 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 
21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799002 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ssh-key-openstack-edpm-ipam\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799103 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-telemetry-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799215 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799291 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-repo-setup-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799365 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-neutron-metadata-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 
09:30:32.799507 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-inventory\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799601 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799700 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799784 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99lbb\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-kube-api-access-99lbb\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.799885 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-nova-combined-ca-bundle\") pod \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\" (UID: \"efaee91b-ca19-44cc-b8b4-37f6bf34067a\") " Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.803948 4618 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.804675 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.804712 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.805128 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.805554 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.806458 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.806541 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.806779 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-kube-api-access-99lbb" (OuterVolumeSpecName: "kube-api-access-99lbb") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "kube-api-access-99lbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.807183 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.807345 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.807468 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.808428 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.823052 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-inventory" (OuterVolumeSpecName: "inventory") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.824712 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "efaee91b-ca19-44cc-b8b4-37f6bf34067a" (UID: "efaee91b-ca19-44cc-b8b4-37f6bf34067a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902077 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902105 4618 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902116 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902126 4618 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902168 4618 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902182 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902193 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902203 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902213 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99lbb\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-kube-api-access-99lbb\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902222 4618 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc 
kubenswrapper[4618]: I0121 09:30:32.902231 4618 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902241 4618 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902252 4618 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efaee91b-ca19-44cc-b8b4-37f6bf34067a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:32 crc kubenswrapper[4618]: I0121 09:30:32.902261 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efaee91b-ca19-44cc-b8b4-37f6bf34067a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.373265 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" event={"ID":"efaee91b-ca19-44cc-b8b4-37f6bf34067a","Type":"ContainerDied","Data":"25fc92f6586b24c2321da0e92b128ffe77552b36d662f3763cdf88d173452b6a"} Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.373336 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25fc92f6586b24c2321da0e92b128ffe77552b36d662f3763cdf88d173452b6a" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.373394 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-476wc" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.446192 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp"] Jan 21 09:30:33 crc kubenswrapper[4618]: E0121 09:30:33.446864 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="extract-content" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.446884 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="extract-content" Jan 21 09:30:33 crc kubenswrapper[4618]: E0121 09:30:33.446901 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efaee91b-ca19-44cc-b8b4-37f6bf34067a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.446909 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="efaee91b-ca19-44cc-b8b4-37f6bf34067a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 21 09:30:33 crc kubenswrapper[4618]: E0121 09:30:33.446940 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="extract-utilities" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.446948 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="extract-utilities" Jan 21 09:30:33 crc kubenswrapper[4618]: E0121 09:30:33.446964 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="registry-server" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.446969 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="registry-server" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.447199 
4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="efaee91b-ca19-44cc-b8b4-37f6bf34067a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.447226 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="261c4f02-14ba-4f25-8f82-a10da73a16f8" containerName="registry-server" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.447951 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.449870 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.450252 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.450293 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.450321 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.452935 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.456589 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp"] Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.512249 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" 
(UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.512322 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28d56297-035c-4b19-8135-4d63d60b9b62-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.512437 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.512486 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.512533 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxdb\" (UniqueName: \"kubernetes.io/projected/28d56297-035c-4b19-8135-4d63d60b9b62-kube-api-access-lqxdb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.613982 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.614067 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.614107 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxdb\" (UniqueName: \"kubernetes.io/projected/28d56297-035c-4b19-8135-4d63d60b9b62-kube-api-access-lqxdb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.614135 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.614192 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28d56297-035c-4b19-8135-4d63d60b9b62-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: 
\"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.615083 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28d56297-035c-4b19-8135-4d63d60b9b62-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.619008 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.619407 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.619817 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.629860 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxdb\" (UniqueName: 
\"kubernetes.io/projected/28d56297-035c-4b19-8135-4d63d60b9b62-kube-api-access-lqxdb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-25xdp\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:33 crc kubenswrapper[4618]: I0121 09:30:33.765604 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:30:34 crc kubenswrapper[4618]: I0121 09:30:34.236434 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp"] Jan 21 09:30:34 crc kubenswrapper[4618]: I0121 09:30:34.382547 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" event={"ID":"28d56297-035c-4b19-8135-4d63d60b9b62","Type":"ContainerStarted","Data":"b45bfec0d9897012dda4306222884320ff1850ddf3754c0b4f4c1b9e335e4b42"} Jan 21 09:30:35 crc kubenswrapper[4618]: I0121 09:30:35.390318 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" event={"ID":"28d56297-035c-4b19-8135-4d63d60b9b62","Type":"ContainerStarted","Data":"a691019d62ab3bfebf09a7b87bb0483935c48381a257703fca6f5f777b827f6c"} Jan 21 09:30:35 crc kubenswrapper[4618]: I0121 09:30:35.407559 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" podStartSLOduration=1.8179609220000001 podStartE2EDuration="2.407545139s" podCreationTimestamp="2026-01-21 09:30:33 +0000 UTC" firstStartedPulling="2026-01-21 09:30:34.238999863 +0000 UTC m=+1632.989467181" lastFinishedPulling="2026-01-21 09:30:34.828584081 +0000 UTC m=+1633.579051398" observedRunningTime="2026-01-21 09:30:35.406507608 +0000 UTC m=+1634.156974935" watchObservedRunningTime="2026-01-21 09:30:35.407545139 +0000 UTC m=+1634.158012456" Jan 21 09:30:56 crc kubenswrapper[4618]: I0121 
09:30:56.959421 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:30:56 crc kubenswrapper[4618]: I0121 09:30:56.960068 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:30:57 crc kubenswrapper[4618]: I0121 09:30:57.032960 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6qr5n"] Jan 21 09:30:57 crc kubenswrapper[4618]: I0121 09:30:57.038947 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6qr5n"] Jan 21 09:30:57 crc kubenswrapper[4618]: I0121 09:30:57.555709 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd873f7b-8ec3-44de-85f7-073977049c57" path="/var/lib/kubelet/pods/cd873f7b-8ec3-44de-85f7-073977049c57/volumes" Jan 21 09:31:19 crc kubenswrapper[4618]: I0121 09:31:19.736889 4618 generic.go:334] "Generic (PLEG): container finished" podID="28d56297-035c-4b19-8135-4d63d60b9b62" containerID="a691019d62ab3bfebf09a7b87bb0483935c48381a257703fca6f5f777b827f6c" exitCode=0 Jan 21 09:31:19 crc kubenswrapper[4618]: I0121 09:31:19.736989 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" event={"ID":"28d56297-035c-4b19-8135-4d63d60b9b62","Type":"ContainerDied","Data":"a691019d62ab3bfebf09a7b87bb0483935c48381a257703fca6f5f777b827f6c"} Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.089395 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.241224 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28d56297-035c-4b19-8135-4d63d60b9b62-ovncontroller-config-0\") pod \"28d56297-035c-4b19-8135-4d63d60b9b62\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.241327 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ssh-key-openstack-edpm-ipam\") pod \"28d56297-035c-4b19-8135-4d63d60b9b62\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.241511 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ovn-combined-ca-bundle\") pod \"28d56297-035c-4b19-8135-4d63d60b9b62\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.241596 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-inventory\") pod \"28d56297-035c-4b19-8135-4d63d60b9b62\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.241763 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqxdb\" (UniqueName: \"kubernetes.io/projected/28d56297-035c-4b19-8135-4d63d60b9b62-kube-api-access-lqxdb\") pod \"28d56297-035c-4b19-8135-4d63d60b9b62\" (UID: \"28d56297-035c-4b19-8135-4d63d60b9b62\") " Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.247022 4618 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "28d56297-035c-4b19-8135-4d63d60b9b62" (UID: "28d56297-035c-4b19-8135-4d63d60b9b62"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.247220 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d56297-035c-4b19-8135-4d63d60b9b62-kube-api-access-lqxdb" (OuterVolumeSpecName: "kube-api-access-lqxdb") pod "28d56297-035c-4b19-8135-4d63d60b9b62" (UID: "28d56297-035c-4b19-8135-4d63d60b9b62"). InnerVolumeSpecName "kube-api-access-lqxdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.260708 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d56297-035c-4b19-8135-4d63d60b9b62-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "28d56297-035c-4b19-8135-4d63d60b9b62" (UID: "28d56297-035c-4b19-8135-4d63d60b9b62"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.265323 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28d56297-035c-4b19-8135-4d63d60b9b62" (UID: "28d56297-035c-4b19-8135-4d63d60b9b62"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.265649 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-inventory" (OuterVolumeSpecName: "inventory") pod "28d56297-035c-4b19-8135-4d63d60b9b62" (UID: "28d56297-035c-4b19-8135-4d63d60b9b62"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.344984 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqxdb\" (UniqueName: \"kubernetes.io/projected/28d56297-035c-4b19-8135-4d63d60b9b62-kube-api-access-lqxdb\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.345016 4618 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28d56297-035c-4b19-8135-4d63d60b9b62-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.345028 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.345039 4618 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.345051 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28d56297-035c-4b19-8135-4d63d60b9b62-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.753193 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" event={"ID":"28d56297-035c-4b19-8135-4d63d60b9b62","Type":"ContainerDied","Data":"b45bfec0d9897012dda4306222884320ff1850ddf3754c0b4f4c1b9e335e4b42"} Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.753241 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45bfec0d9897012dda4306222884320ff1850ddf3754c0b4f4c1b9e335e4b42" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.753298 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-25xdp" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.832185 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm"] Jan 21 09:31:21 crc kubenswrapper[4618]: E0121 09:31:21.832667 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d56297-035c-4b19-8135-4d63d60b9b62" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.832687 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d56297-035c-4b19-8135-4d63d60b9b62" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.832896 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d56297-035c-4b19-8135-4d63d60b9b62" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.833614 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.837861 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.838002 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.838201 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.838234 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.838407 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.839718 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.841014 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm"] Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.854985 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.855028 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.855110 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.855273 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq2zz\" (UniqueName: \"kubernetes.io/projected/0bbeab64-b3ee-4412-a66b-c5871248bddb-kube-api-access-rq2zz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.855323 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.855352 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.957383 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq2zz\" (UniqueName: \"kubernetes.io/projected/0bbeab64-b3ee-4412-a66b-c5871248bddb-kube-api-access-rq2zz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.957751 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.957779 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.957874 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.957900 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.958945 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.963328 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.963344 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.963444 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.963794 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.964636 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:21 crc kubenswrapper[4618]: I0121 09:31:21.973929 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq2zz\" (UniqueName: \"kubernetes.io/projected/0bbeab64-b3ee-4412-a66b-c5871248bddb-kube-api-access-rq2zz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:22 crc kubenswrapper[4618]: I0121 09:31:22.147286 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:22 crc kubenswrapper[4618]: I0121 09:31:22.609200 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm"] Jan 21 09:31:22 crc kubenswrapper[4618]: I0121 09:31:22.765009 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" event={"ID":"0bbeab64-b3ee-4412-a66b-c5871248bddb","Type":"ContainerStarted","Data":"249e86a01b7e26f407a7bb1af7644a7e7db330c5cdcd8353277777ef157cbb41"} Jan 21 09:31:23 crc kubenswrapper[4618]: I0121 09:31:23.777575 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" event={"ID":"0bbeab64-b3ee-4412-a66b-c5871248bddb","Type":"ContainerStarted","Data":"3026661c6f4bbc0df68dbd3f8598e6b5ed101d12bb00ad80809660b6a148d802"} Jan 21 09:31:23 crc kubenswrapper[4618]: I0121 09:31:23.800343 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" podStartSLOduration=2.276559772 podStartE2EDuration="2.800320829s" podCreationTimestamp="2026-01-21 09:31:21 +0000 UTC" firstStartedPulling="2026-01-21 09:31:22.614756793 +0000 UTC m=+1681.365224110" lastFinishedPulling="2026-01-21 09:31:23.13851785 +0000 UTC m=+1681.888985167" observedRunningTime="2026-01-21 09:31:23.792875154 +0000 UTC m=+1682.543342471" watchObservedRunningTime="2026-01-21 09:31:23.800320829 +0000 UTC m=+1682.550788146" Jan 21 09:31:26 crc kubenswrapper[4618]: I0121 09:31:26.959445 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:31:26 crc kubenswrapper[4618]: I0121 09:31:26.960164 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:31:26 crc kubenswrapper[4618]: I0121 09:31:26.960226 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:31:26 crc kubenswrapper[4618]: I0121 09:31:26.960755 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:31:26 crc kubenswrapper[4618]: I0121 09:31:26.960813 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" gracePeriod=600 Jan 21 09:31:27 crc kubenswrapper[4618]: E0121 09:31:27.080831 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:31:27 crc kubenswrapper[4618]: I0121 09:31:27.824859 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" exitCode=0 Jan 21 09:31:27 crc kubenswrapper[4618]: I0121 09:31:27.824949 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c"} Jan 21 09:31:27 crc kubenswrapper[4618]: I0121 09:31:27.825237 4618 scope.go:117] "RemoveContainer" containerID="bd5991d4f6a04d2792fe93fca75aa63db48c790a76522c4fc7b2da7178ac6df6" Jan 21 09:31:27 crc kubenswrapper[4618]: I0121 09:31:27.826096 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:31:27 crc kubenswrapper[4618]: E0121 09:31:27.826382 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:31:28 crc kubenswrapper[4618]: I0121 09:31:28.961261 4618 scope.go:117] "RemoveContainer" containerID="9166ef3a70452b870b8baf61b10cfd450d3a7ab1e52f13b98739fb31de4d04d7" Jan 21 09:31:40 crc kubenswrapper[4618]: I0121 09:31:40.537888 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:31:40 crc kubenswrapper[4618]: E0121 09:31:40.538598 4618 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:31:55 crc kubenswrapper[4618]: I0121 09:31:55.540323 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:31:55 crc kubenswrapper[4618]: E0121 09:31:55.545316 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:31:58 crc kubenswrapper[4618]: I0121 09:31:58.090195 4618 generic.go:334] "Generic (PLEG): container finished" podID="0bbeab64-b3ee-4412-a66b-c5871248bddb" containerID="3026661c6f4bbc0df68dbd3f8598e6b5ed101d12bb00ad80809660b6a148d802" exitCode=0 Jan 21 09:31:58 crc kubenswrapper[4618]: I0121 09:31:58.090270 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" event={"ID":"0bbeab64-b3ee-4412-a66b-c5871248bddb","Type":"ContainerDied","Data":"3026661c6f4bbc0df68dbd3f8598e6b5ed101d12bb00ad80809660b6a148d802"} Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.437997 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.474347 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-nova-metadata-neutron-config-0\") pod \"0bbeab64-b3ee-4412-a66b-c5871248bddb\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.474418 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq2zz\" (UniqueName: \"kubernetes.io/projected/0bbeab64-b3ee-4412-a66b-c5871248bddb-kube-api-access-rq2zz\") pod \"0bbeab64-b3ee-4412-a66b-c5871248bddb\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.474452 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-ssh-key-openstack-edpm-ipam\") pod \"0bbeab64-b3ee-4412-a66b-c5871248bddb\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.474566 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-metadata-combined-ca-bundle\") pod \"0bbeab64-b3ee-4412-a66b-c5871248bddb\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.474652 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0bbeab64-b3ee-4412-a66b-c5871248bddb\" (UID: 
\"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.474705 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-inventory\") pod \"0bbeab64-b3ee-4412-a66b-c5871248bddb\" (UID: \"0bbeab64-b3ee-4412-a66b-c5871248bddb\") " Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.480668 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbeab64-b3ee-4412-a66b-c5871248bddb-kube-api-access-rq2zz" (OuterVolumeSpecName: "kube-api-access-rq2zz") pod "0bbeab64-b3ee-4412-a66b-c5871248bddb" (UID: "0bbeab64-b3ee-4412-a66b-c5871248bddb"). InnerVolumeSpecName "kube-api-access-rq2zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.480708 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0bbeab64-b3ee-4412-a66b-c5871248bddb" (UID: "0bbeab64-b3ee-4412-a66b-c5871248bddb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.497584 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0bbeab64-b3ee-4412-a66b-c5871248bddb" (UID: "0bbeab64-b3ee-4412-a66b-c5871248bddb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.497628 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0bbeab64-b3ee-4412-a66b-c5871248bddb" (UID: "0bbeab64-b3ee-4412-a66b-c5871248bddb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.500299 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-inventory" (OuterVolumeSpecName: "inventory") pod "0bbeab64-b3ee-4412-a66b-c5871248bddb" (UID: "0bbeab64-b3ee-4412-a66b-c5871248bddb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.501679 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0bbeab64-b3ee-4412-a66b-c5871248bddb" (UID: "0bbeab64-b3ee-4412-a66b-c5871248bddb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.576770 4618 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.576798 4618 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.576811 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.576821 4618 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.576833 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq2zz\" (UniqueName: \"kubernetes.io/projected/0bbeab64-b3ee-4412-a66b-c5871248bddb-kube-api-access-rq2zz\") on node \"crc\" DevicePath \"\"" Jan 21 09:31:59 crc kubenswrapper[4618]: I0121 09:31:59.576843 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bbeab64-b3ee-4412-a66b-c5871248bddb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.107186 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" event={"ID":"0bbeab64-b3ee-4412-a66b-c5871248bddb","Type":"ContainerDied","Data":"249e86a01b7e26f407a7bb1af7644a7e7db330c5cdcd8353277777ef157cbb41"} Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.107239 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="249e86a01b7e26f407a7bb1af7644a7e7db330c5cdcd8353277777ef157cbb41" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.107361 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.191273 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr"] Jan 21 09:32:00 crc kubenswrapper[4618]: E0121 09:32:00.192374 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbeab64-b3ee-4412-a66b-c5871248bddb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.192409 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbeab64-b3ee-4412-a66b-c5871248bddb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.192724 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbeab64-b3ee-4412-a66b-c5871248bddb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.193849 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.196472 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.196895 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.197071 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.197169 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.197191 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.201902 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr"] Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.287362 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jz42\" (UniqueName: \"kubernetes.io/projected/5ff62cc0-5880-4589-ac86-a671f9533ff4-kube-api-access-9jz42\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.287409 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.287541 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.287585 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.287748 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.390433 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jz42\" (UniqueName: \"kubernetes.io/projected/5ff62cc0-5880-4589-ac86-a671f9533ff4-kube-api-access-9jz42\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.390492 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.390570 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.390597 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.390954 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.394052 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: 
\"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.394221 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.394443 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.394964 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.405892 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jz42\" (UniqueName: \"kubernetes.io/projected/5ff62cc0-5880-4589-ac86-a671f9533ff4-kube-api-access-9jz42\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.509048 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:32:00 crc kubenswrapper[4618]: I0121 09:32:00.992543 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr"] Jan 21 09:32:01 crc kubenswrapper[4618]: I0121 09:32:01.117464 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" event={"ID":"5ff62cc0-5880-4589-ac86-a671f9533ff4","Type":"ContainerStarted","Data":"ffc38dd6cb8ad1a8b7980dbf0510ac38b540fdc2d144619edd1d335e90752a64"} Jan 21 09:32:02 crc kubenswrapper[4618]: I0121 09:32:02.129084 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" event={"ID":"5ff62cc0-5880-4589-ac86-a671f9533ff4","Type":"ContainerStarted","Data":"d6df21be9b8a950b0326b2072858830bfa8d10f75dbdf6bbe800db5d36f9f4c7"} Jan 21 09:32:02 crc kubenswrapper[4618]: I0121 09:32:02.147346 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" podStartSLOduration=1.567813482 podStartE2EDuration="2.147331658s" podCreationTimestamp="2026-01-21 09:32:00 +0000 UTC" firstStartedPulling="2026-01-21 09:32:00.999768743 +0000 UTC m=+1719.750236060" lastFinishedPulling="2026-01-21 09:32:01.579286919 +0000 UTC m=+1720.329754236" observedRunningTime="2026-01-21 09:32:02.143373146 +0000 UTC m=+1720.893840463" watchObservedRunningTime="2026-01-21 09:32:02.147331658 +0000 UTC m=+1720.897798975" Jan 21 09:32:08 crc kubenswrapper[4618]: I0121 09:32:08.539087 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:32:08 crc kubenswrapper[4618]: E0121 09:32:08.540447 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:32:20 crc kubenswrapper[4618]: I0121 09:32:20.537778 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:32:20 crc kubenswrapper[4618]: E0121 09:32:20.538705 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:32:31 crc kubenswrapper[4618]: I0121 09:32:31.545214 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:32:31 crc kubenswrapper[4618]: E0121 09:32:31.546659 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:32:45 crc kubenswrapper[4618]: I0121 09:32:45.540975 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:32:45 crc kubenswrapper[4618]: E0121 09:32:45.543561 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:32:58 crc kubenswrapper[4618]: I0121 09:32:58.538550 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:32:58 crc kubenswrapper[4618]: E0121 09:32:58.539617 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:33:11 crc kubenswrapper[4618]: I0121 09:33:11.542881 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:33:11 crc kubenswrapper[4618]: E0121 09:33:11.543678 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:33:25 crc kubenswrapper[4618]: I0121 09:33:25.537944 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:33:25 crc kubenswrapper[4618]: E0121 09:33:25.538834 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:33:40 crc kubenswrapper[4618]: I0121 09:33:40.538125 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:33:40 crc kubenswrapper[4618]: E0121 09:33:40.538882 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.667782 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2bgjp"] Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.670246 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.675324 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bgjp"] Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.849319 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-catalog-content\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.849371 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-utilities\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.850400 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwrtz\" (UniqueName: \"kubernetes.io/projected/8282622f-0957-4284-9687-018b29cd9938-kube-api-access-zwrtz\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.952010 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwrtz\" (UniqueName: \"kubernetes.io/projected/8282622f-0957-4284-9687-018b29cd9938-kube-api-access-zwrtz\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.952165 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-catalog-content\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.952186 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-utilities\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.952648 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-utilities\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.953664 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-catalog-content\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.972055 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwrtz\" (UniqueName: \"kubernetes.io/projected/8282622f-0957-4284-9687-018b29cd9938-kube-api-access-zwrtz\") pod \"certified-operators-2bgjp\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:41 crc kubenswrapper[4618]: I0121 09:33:41.993438 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:42 crc kubenswrapper[4618]: I0121 09:33:42.252121 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bgjp"] Jan 21 09:33:42 crc kubenswrapper[4618]: I0121 09:33:42.937052 4618 generic.go:334] "Generic (PLEG): container finished" podID="8282622f-0957-4284-9687-018b29cd9938" containerID="7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35" exitCode=0 Jan 21 09:33:42 crc kubenswrapper[4618]: I0121 09:33:42.937156 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgjp" event={"ID":"8282622f-0957-4284-9687-018b29cd9938","Type":"ContainerDied","Data":"7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35"} Jan 21 09:33:42 crc kubenswrapper[4618]: I0121 09:33:42.937440 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgjp" event={"ID":"8282622f-0957-4284-9687-018b29cd9938","Type":"ContainerStarted","Data":"bad19b443c67440a594c75fae5f376630e3bda9b84fc50f8988de593d928e8cc"} Jan 21 09:33:42 crc kubenswrapper[4618]: I0121 09:33:42.940125 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:33:43 crc kubenswrapper[4618]: I0121 09:33:43.954212 4618 generic.go:334] "Generic (PLEG): container finished" podID="8282622f-0957-4284-9687-018b29cd9938" containerID="69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3" exitCode=0 Jan 21 09:33:43 crc kubenswrapper[4618]: I0121 09:33:43.954283 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgjp" event={"ID":"8282622f-0957-4284-9687-018b29cd9938","Type":"ContainerDied","Data":"69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3"} Jan 21 09:33:44 crc kubenswrapper[4618]: I0121 09:33:44.964019 4618 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-2bgjp" event={"ID":"8282622f-0957-4284-9687-018b29cd9938","Type":"ContainerStarted","Data":"707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9"} Jan 21 09:33:44 crc kubenswrapper[4618]: I0121 09:33:44.978825 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2bgjp" podStartSLOduration=2.465305541 podStartE2EDuration="3.97881464s" podCreationTimestamp="2026-01-21 09:33:41 +0000 UTC" firstStartedPulling="2026-01-21 09:33:42.939869817 +0000 UTC m=+1821.690337134" lastFinishedPulling="2026-01-21 09:33:44.453378916 +0000 UTC m=+1823.203846233" observedRunningTime="2026-01-21 09:33:44.977619693 +0000 UTC m=+1823.728087011" watchObservedRunningTime="2026-01-21 09:33:44.97881464 +0000 UTC m=+1823.729281958" Jan 21 09:33:51 crc kubenswrapper[4618]: I0121 09:33:51.994369 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:51 crc kubenswrapper[4618]: I0121 09:33:51.995070 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:52 crc kubenswrapper[4618]: I0121 09:33:52.032527 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:52 crc kubenswrapper[4618]: I0121 09:33:52.070826 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:52 crc kubenswrapper[4618]: I0121 09:33:52.259606 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bgjp"] Jan 21 09:33:53 crc kubenswrapper[4618]: I0121 09:33:53.538010 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:33:53 crc 
kubenswrapper[4618]: E0121 09:33:53.538604 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:33:54 crc kubenswrapper[4618]: I0121 09:33:54.024613 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2bgjp" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="registry-server" containerID="cri-o://707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9" gracePeriod=2 Jan 21 09:33:54 crc kubenswrapper[4618]: I0121 09:33:54.883088 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.038352 4618 generic.go:334] "Generic (PLEG): container finished" podID="8282622f-0957-4284-9687-018b29cd9938" containerID="707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9" exitCode=0 Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.038405 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgjp" event={"ID":"8282622f-0957-4284-9687-018b29cd9938","Type":"ContainerDied","Data":"707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9"} Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.038442 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bgjp" event={"ID":"8282622f-0957-4284-9687-018b29cd9938","Type":"ContainerDied","Data":"bad19b443c67440a594c75fae5f376630e3bda9b84fc50f8988de593d928e8cc"} Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 
09:33:55.038441 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bgjp" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.038461 4618 scope.go:117] "RemoveContainer" containerID="707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.038669 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-utilities\") pod \"8282622f-0957-4284-9687-018b29cd9938\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.039481 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwrtz\" (UniqueName: \"kubernetes.io/projected/8282622f-0957-4284-9687-018b29cd9938-kube-api-access-zwrtz\") pod \"8282622f-0957-4284-9687-018b29cd9938\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.039576 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-catalog-content\") pod \"8282622f-0957-4284-9687-018b29cd9938\" (UID: \"8282622f-0957-4284-9687-018b29cd9938\") " Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.039731 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-utilities" (OuterVolumeSpecName: "utilities") pod "8282622f-0957-4284-9687-018b29cd9938" (UID: "8282622f-0957-4284-9687-018b29cd9938"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.042272 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.045499 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8282622f-0957-4284-9687-018b29cd9938-kube-api-access-zwrtz" (OuterVolumeSpecName: "kube-api-access-zwrtz") pod "8282622f-0957-4284-9687-018b29cd9938" (UID: "8282622f-0957-4284-9687-018b29cd9938"). InnerVolumeSpecName "kube-api-access-zwrtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.076569 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8282622f-0957-4284-9687-018b29cd9938" (UID: "8282622f-0957-4284-9687-018b29cd9938"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.088927 4618 scope.go:117] "RemoveContainer" containerID="69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.108504 4618 scope.go:117] "RemoveContainer" containerID="7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.135717 4618 scope.go:117] "RemoveContainer" containerID="707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9" Jan 21 09:33:55 crc kubenswrapper[4618]: E0121 09:33:55.136206 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9\": container with ID starting with 707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9 not found: ID does not exist" containerID="707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.136249 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9"} err="failed to get container status \"707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9\": rpc error: code = NotFound desc = could not find container \"707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9\": container with ID starting with 707cc2b438ca6253f4a993182e4ff0fce6fdc97f9a45eebd992c49a6dc867fd9 not found: ID does not exist" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.136280 4618 scope.go:117] "RemoveContainer" containerID="69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3" Jan 21 09:33:55 crc kubenswrapper[4618]: E0121 09:33:55.136563 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3\": container with ID starting with 69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3 not found: ID does not exist" containerID="69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.136595 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3"} err="failed to get container status \"69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3\": rpc error: code = NotFound desc = could not find container \"69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3\": container with ID starting with 69674b7d864fd3f13dbcc245011487213b007212081c7914766c745b2bc869c3 not found: ID does not exist" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.136617 4618 scope.go:117] "RemoveContainer" containerID="7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35" Jan 21 09:33:55 crc kubenswrapper[4618]: E0121 09:33:55.136867 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35\": container with ID starting with 7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35 not found: ID does not exist" containerID="7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.136913 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35"} err="failed to get container status \"7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35\": rpc error: code = NotFound desc = could not find container \"7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35\": 
container with ID starting with 7d5223d30e7e1811426ddd396898395f309f8ccb782dc966ec36475364c79c35 not found: ID does not exist" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.144499 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwrtz\" (UniqueName: \"kubernetes.io/projected/8282622f-0957-4284-9687-018b29cd9938-kube-api-access-zwrtz\") on node \"crc\" DevicePath \"\"" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.144527 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8282622f-0957-4284-9687-018b29cd9938-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.364924 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bgjp"] Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.369939 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2bgjp"] Jan 21 09:33:55 crc kubenswrapper[4618]: I0121 09:33:55.546850 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8282622f-0957-4284-9687-018b29cd9938" path="/var/lib/kubelet/pods/8282622f-0957-4284-9687-018b29cd9938/volumes" Jan 21 09:34:08 crc kubenswrapper[4618]: I0121 09:34:08.538225 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:34:08 crc kubenswrapper[4618]: E0121 09:34:08.539090 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:34:21 crc kubenswrapper[4618]: I0121 
09:34:21.565831 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:34:21 crc kubenswrapper[4618]: E0121 09:34:21.566986 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:34:36 crc kubenswrapper[4618]: I0121 09:34:36.538014 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:34:36 crc kubenswrapper[4618]: E0121 09:34:36.538838 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:34:49 crc kubenswrapper[4618]: I0121 09:34:49.537834 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:34:49 crc kubenswrapper[4618]: E0121 09:34:49.538608 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:35:00 crc 
kubenswrapper[4618]: I0121 09:35:00.569471 4618 generic.go:334] "Generic (PLEG): container finished" podID="5ff62cc0-5880-4589-ac86-a671f9533ff4" containerID="d6df21be9b8a950b0326b2072858830bfa8d10f75dbdf6bbe800db5d36f9f4c7" exitCode=0 Jan 21 09:35:00 crc kubenswrapper[4618]: I0121 09:35:00.569583 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" event={"ID":"5ff62cc0-5880-4589-ac86-a671f9533ff4","Type":"ContainerDied","Data":"d6df21be9b8a950b0326b2072858830bfa8d10f75dbdf6bbe800db5d36f9f4c7"} Jan 21 09:35:01 crc kubenswrapper[4618]: I0121 09:35:01.930707 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.035364 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jz42\" (UniqueName: \"kubernetes.io/projected/5ff62cc0-5880-4589-ac86-a671f9533ff4-kube-api-access-9jz42\") pod \"5ff62cc0-5880-4589-ac86-a671f9533ff4\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.035464 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-ssh-key-openstack-edpm-ipam\") pod \"5ff62cc0-5880-4589-ac86-a671f9533ff4\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.035560 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-combined-ca-bundle\") pod \"5ff62cc0-5880-4589-ac86-a671f9533ff4\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.035605 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-inventory\") pod \"5ff62cc0-5880-4589-ac86-a671f9533ff4\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.035673 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-secret-0\") pod \"5ff62cc0-5880-4589-ac86-a671f9533ff4\" (UID: \"5ff62cc0-5880-4589-ac86-a671f9533ff4\") " Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.043008 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5ff62cc0-5880-4589-ac86-a671f9533ff4" (UID: "5ff62cc0-5880-4589-ac86-a671f9533ff4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.043065 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff62cc0-5880-4589-ac86-a671f9533ff4-kube-api-access-9jz42" (OuterVolumeSpecName: "kube-api-access-9jz42") pod "5ff62cc0-5880-4589-ac86-a671f9533ff4" (UID: "5ff62cc0-5880-4589-ac86-a671f9533ff4"). InnerVolumeSpecName "kube-api-access-9jz42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.057695 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5ff62cc0-5880-4589-ac86-a671f9533ff4" (UID: "5ff62cc0-5880-4589-ac86-a671f9533ff4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.058686 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-inventory" (OuterVolumeSpecName: "inventory") pod "5ff62cc0-5880-4589-ac86-a671f9533ff4" (UID: "5ff62cc0-5880-4589-ac86-a671f9533ff4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.059390 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5ff62cc0-5880-4589-ac86-a671f9533ff4" (UID: "5ff62cc0-5880-4589-ac86-a671f9533ff4"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.138624 4618 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.138653 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jz42\" (UniqueName: \"kubernetes.io/projected/5ff62cc0-5880-4589-ac86-a671f9533ff4-kube-api-access-9jz42\") on node \"crc\" DevicePath \"\"" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.138666 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.138677 4618 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.138689 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff62cc0-5880-4589-ac86-a671f9533ff4-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.585798 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" event={"ID":"5ff62cc0-5880-4589-ac86-a671f9533ff4","Type":"ContainerDied","Data":"ffc38dd6cb8ad1a8b7980dbf0510ac38b540fdc2d144619edd1d335e90752a64"} Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.585996 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc38dd6cb8ad1a8b7980dbf0510ac38b540fdc2d144619edd1d335e90752a64" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.585856 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.678821 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524"] Jan 21 09:35:02 crc kubenswrapper[4618]: E0121 09:35:02.679994 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="registry-server" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.680016 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="registry-server" Jan 21 09:35:02 crc kubenswrapper[4618]: E0121 09:35:02.680037 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="extract-content" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.680045 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="extract-content" Jan 21 09:35:02 crc kubenswrapper[4618]: E0121 09:35:02.680084 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="extract-utilities" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.680092 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="extract-utilities" Jan 21 09:35:02 crc kubenswrapper[4618]: E0121 09:35:02.680121 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff62cc0-5880-4589-ac86-a671f9533ff4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.680130 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff62cc0-5880-4589-ac86-a671f9533ff4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.680681 4618 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff62cc0-5880-4589-ac86-a671f9533ff4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.680732 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="8282622f-0957-4284-9687-018b29cd9938" containerName="registry-server" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.682928 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.686974 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.687178 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.687369 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.687492 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.687563 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.687566 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.687562 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.694823 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524"] Jan 21 09:35:02 crc 
kubenswrapper[4618]: I0121 09:35:02.749331 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6mx\" (UniqueName: \"kubernetes.io/projected/d826d9d4-6108-4f59-9c79-313f8f3b3d19-kube-api-access-nw6mx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749499 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749522 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749546 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749577 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749774 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749843 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.749907 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.750015 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" 
(UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.851855 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.852271 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.852438 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.852551 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.852654 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.852755 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6mx\" (UniqueName: \"kubernetes.io/projected/d826d9d4-6108-4f59-9c79-313f8f3b3d19-kube-api-access-nw6mx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.852979 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.853083 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.853182 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: 
\"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.853935 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.858527 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.858636 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.858983 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.859040 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.859492 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.859784 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.861092 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:02 crc kubenswrapper[4618]: I0121 09:35:02.868026 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6mx\" (UniqueName: \"kubernetes.io/projected/d826d9d4-6108-4f59-9c79-313f8f3b3d19-kube-api-access-nw6mx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-r4524\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 
09:35:03 crc kubenswrapper[4618]: I0121 09:35:03.007500 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:35:03 crc kubenswrapper[4618]: I0121 09:35:03.488797 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524"] Jan 21 09:35:03 crc kubenswrapper[4618]: I0121 09:35:03.596376 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" event={"ID":"d826d9d4-6108-4f59-9c79-313f8f3b3d19","Type":"ContainerStarted","Data":"30667fa5a4c43a2361272a594140e9d5577d737c3cacb1987a3c400753c7c061"} Jan 21 09:35:04 crc kubenswrapper[4618]: I0121 09:35:04.538834 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:35:04 crc kubenswrapper[4618]: E0121 09:35:04.540244 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:35:04 crc kubenswrapper[4618]: I0121 09:35:04.622966 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" event={"ID":"d826d9d4-6108-4f59-9c79-313f8f3b3d19","Type":"ContainerStarted","Data":"1c3a5b1a8a9beb64bbe4247e63e22bbebd4bf457cf30f1a4853f55005564482c"} Jan 21 09:35:04 crc kubenswrapper[4618]: I0121 09:35:04.648022 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" podStartSLOduration=2.072716894 podStartE2EDuration="2.648003452s" 
podCreationTimestamp="2026-01-21 09:35:02 +0000 UTC" firstStartedPulling="2026-01-21 09:35:03.491548121 +0000 UTC m=+1902.242015437" lastFinishedPulling="2026-01-21 09:35:04.066834678 +0000 UTC m=+1902.817301995" observedRunningTime="2026-01-21 09:35:04.63951704 +0000 UTC m=+1903.389984357" watchObservedRunningTime="2026-01-21 09:35:04.648003452 +0000 UTC m=+1903.398470769" Jan 21 09:35:19 crc kubenswrapper[4618]: I0121 09:35:19.537689 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:35:19 crc kubenswrapper[4618]: E0121 09:35:19.538512 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:35:31 crc kubenswrapper[4618]: I0121 09:35:31.542107 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:35:31 crc kubenswrapper[4618]: E0121 09:35:31.542981 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:35:44 crc kubenswrapper[4618]: I0121 09:35:44.538374 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:35:44 crc kubenswrapper[4618]: E0121 09:35:44.539481 4618 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:35:58 crc kubenswrapper[4618]: I0121 09:35:58.538020 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:35:58 crc kubenswrapper[4618]: E0121 09:35:58.538953 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:36:11 crc kubenswrapper[4618]: I0121 09:36:11.542602 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:36:11 crc kubenswrapper[4618]: E0121 09:36:11.543494 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:36:23 crc kubenswrapper[4618]: I0121 09:36:23.537887 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:36:23 crc kubenswrapper[4618]: E0121 09:36:23.538860 4618 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:36:35 crc kubenswrapper[4618]: I0121 09:36:35.537817 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:36:36 crc kubenswrapper[4618]: I0121 09:36:36.325401 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"924e31a43b2bbfc76860cd081d76cdf76941a38a9ce3969a25b5de990ef00dea"} Jan 21 09:36:39 crc kubenswrapper[4618]: I0121 09:36:39.354064 4618 generic.go:334] "Generic (PLEG): container finished" podID="d826d9d4-6108-4f59-9c79-313f8f3b3d19" containerID="1c3a5b1a8a9beb64bbe4247e63e22bbebd4bf457cf30f1a4853f55005564482c" exitCode=0 Jan 21 09:36:39 crc kubenswrapper[4618]: I0121 09:36:39.354198 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" event={"ID":"d826d9d4-6108-4f59-9c79-313f8f3b3d19","Type":"ContainerDied","Data":"1c3a5b1a8a9beb64bbe4247e63e22bbebd4bf457cf30f1a4853f55005564482c"} Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.701756 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748350 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-1\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748403 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-ssh-key-openstack-edpm-ipam\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748552 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-1\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748566 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-0\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748590 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-extra-config-0\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 
09:36:40.748652 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6mx\" (UniqueName: \"kubernetes.io/projected/d826d9d4-6108-4f59-9c79-313f8f3b3d19-kube-api-access-nw6mx\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748683 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-combined-ca-bundle\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748721 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-inventory\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.748748 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-0\") pod \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\" (UID: \"d826d9d4-6108-4f59-9c79-313f8f3b3d19\") " Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.766396 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.766405 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d826d9d4-6108-4f59-9c79-313f8f3b3d19-kube-api-access-nw6mx" (OuterVolumeSpecName: "kube-api-access-nw6mx") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "kube-api-access-nw6mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.772795 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.774196 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.774499 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.776034 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.777563 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.780749 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.800395 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-inventory" (OuterVolumeSpecName: "inventory") pod "d826d9d4-6108-4f59-9c79-313f8f3b3d19" (UID: "d826d9d4-6108-4f59-9c79-313f8f3b3d19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.850465 4618 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.850558 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.850682 4618 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.850778 4618 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.850889 4618 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.851004 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6mx\" (UniqueName: \"kubernetes.io/projected/d826d9d4-6108-4f59-9c79-313f8f3b3d19-kube-api-access-nw6mx\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.851068 4618 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.851157 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:40 crc kubenswrapper[4618]: I0121 09:36:40.851560 4618 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d826d9d4-6108-4f59-9c79-313f8f3b3d19-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.377116 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" event={"ID":"d826d9d4-6108-4f59-9c79-313f8f3b3d19","Type":"ContainerDied","Data":"30667fa5a4c43a2361272a594140e9d5577d737c3cacb1987a3c400753c7c061"} Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.377571 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30667fa5a4c43a2361272a594140e9d5577d737c3cacb1987a3c400753c7c061" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.377198 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-r4524" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.455948 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn"] Jan 21 09:36:41 crc kubenswrapper[4618]: E0121 09:36:41.456456 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d826d9d4-6108-4f59-9c79-313f8f3b3d19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.456479 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="d826d9d4-6108-4f59-9c79-313f8f3b3d19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.456692 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="d826d9d4-6108-4f59-9c79-313f8f3b3d19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.457722 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.459240 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.460162 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.461463 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.461591 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.461724 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-nfd9f" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.474452 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn"] Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565096 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565260 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqc8m\" (UniqueName: \"kubernetes.io/projected/4d8904ba-60fd-453f-884f-6fe7003c205f-kube-api-access-sqc8m\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565521 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565592 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565735 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565777 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: 
\"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.565817 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667237 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667286 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667325 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667459 
4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667491 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqc8m\" (UniqueName: \"kubernetes.io/projected/4d8904ba-60fd-453f-884f-6fe7003c205f-kube-api-access-sqc8m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667592 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.667627 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.672025 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.672372 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.673003 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.673201 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.681691 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.682035 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.682793 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqc8m\" (UniqueName: \"kubernetes.io/projected/4d8904ba-60fd-453f-884f-6fe7003c205f-kube-api-access-sqc8m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:41 crc kubenswrapper[4618]: I0121 09:36:41.784825 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:36:42 crc kubenswrapper[4618]: I0121 09:36:42.230404 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn"] Jan 21 09:36:42 crc kubenswrapper[4618]: I0121 09:36:42.386714 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" event={"ID":"4d8904ba-60fd-453f-884f-6fe7003c205f","Type":"ContainerStarted","Data":"9cbefc87aa6a155c9a1ab81e1bf02d6f63d3f0fdad2071adea73d724524fbe27"} Jan 21 09:36:43 crc kubenswrapper[4618]: I0121 09:36:43.395766 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" event={"ID":"4d8904ba-60fd-453f-884f-6fe7003c205f","Type":"ContainerStarted","Data":"9fa56a584164f9b3e7446f9fd29e88929cd1e0f6a3e6f110d6fb93c112239905"} Jan 21 09:36:43 crc kubenswrapper[4618]: I0121 09:36:43.419150 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" podStartSLOduration=1.884618932 podStartE2EDuration="2.419117894s" podCreationTimestamp="2026-01-21 09:36:41 +0000 UTC" firstStartedPulling="2026-01-21 09:36:42.238861476 +0000 UTC m=+2000.989328793" lastFinishedPulling="2026-01-21 09:36:42.773360438 +0000 UTC m=+2001.523827755" observedRunningTime="2026-01-21 09:36:43.407007932 +0000 UTC m=+2002.157475248" watchObservedRunningTime="2026-01-21 09:36:43.419117894 +0000 UTC m=+2002.169585210" Jan 21 09:38:26 crc kubenswrapper[4618]: I0121 09:38:26.179459 4618 generic.go:334] "Generic (PLEG): container finished" podID="4d8904ba-60fd-453f-884f-6fe7003c205f" containerID="9fa56a584164f9b3e7446f9fd29e88929cd1e0f6a3e6f110d6fb93c112239905" exitCode=0 Jan 21 09:38:26 crc kubenswrapper[4618]: I0121 09:38:26.179551 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" event={"ID":"4d8904ba-60fd-453f-884f-6fe7003c205f","Type":"ContainerDied","Data":"9fa56a584164f9b3e7446f9fd29e88929cd1e0f6a3e6f110d6fb93c112239905"} Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.498622 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.536777 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ssh-key-openstack-edpm-ipam\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.536884 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqc8m\" (UniqueName: \"kubernetes.io/projected/4d8904ba-60fd-453f-884f-6fe7003c205f-kube-api-access-sqc8m\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.537012 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-telemetry-combined-ca-bundle\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.537048 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-0\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 
09:38:27.537082 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-2\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.537391 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-1\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.537454 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-inventory\") pod \"4d8904ba-60fd-453f-884f-6fe7003c205f\" (UID: \"4d8904ba-60fd-453f-884f-6fe7003c205f\") " Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.542872 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.551727 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8904ba-60fd-453f-884f-6fe7003c205f-kube-api-access-sqc8m" (OuterVolumeSpecName: "kube-api-access-sqc8m") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "kube-api-access-sqc8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.560502 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.562339 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.563008 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.568087 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-inventory" (OuterVolumeSpecName: "inventory") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.568516 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4d8904ba-60fd-453f-884f-6fe7003c205f" (UID: "4d8904ba-60fd-453f-884f-6fe7003c205f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.639407 4618 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.639523 4618 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.640037 4618 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.640100 4618 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.640119 4618 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-inventory\") on node \"crc\" DevicePath \"\"" Jan 
21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.640132 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d8904ba-60fd-453f-884f-6fe7003c205f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 09:38:27 crc kubenswrapper[4618]: I0121 09:38:27.640161 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqc8m\" (UniqueName: \"kubernetes.io/projected/4d8904ba-60fd-453f-884f-6fe7003c205f-kube-api-access-sqc8m\") on node \"crc\" DevicePath \"\"" Jan 21 09:38:28 crc kubenswrapper[4618]: I0121 09:38:28.194482 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" event={"ID":"4d8904ba-60fd-453f-884f-6fe7003c205f","Type":"ContainerDied","Data":"9cbefc87aa6a155c9a1ab81e1bf02d6f63d3f0fdad2071adea73d724524fbe27"} Jan 21 09:38:28 crc kubenswrapper[4618]: I0121 09:38:28.194513 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn" Jan 21 09:38:28 crc kubenswrapper[4618]: I0121 09:38:28.194530 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cbefc87aa6a155c9a1ab81e1bf02d6f63d3f0fdad2071adea73d724524fbe27" Jan 21 09:38:53 crc kubenswrapper[4618]: E0121 09:38:53.497397 4618 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.98:44150->192.168.25.98:41395: read tcp 192.168.25.98:44150->192.168.25.98:41395: read: connection reset by peer Jan 21 09:38:56 crc kubenswrapper[4618]: I0121 09:38:56.958531 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:38:56 crc kubenswrapper[4618]: I0121 09:38:56.959240 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.207459 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 21 09:39:01 crc kubenswrapper[4618]: E0121 09:39:01.208456 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8904ba-60fd-453f-884f-6fe7003c205f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.208472 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8904ba-60fd-453f-884f-6fe7003c205f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 
09:39:01.208645 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8904ba-60fd-453f-884f-6fe7003c205f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.209266 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.212548 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.212733 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.212804 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wffvd" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.212902 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.214612 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.283637 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-config-data\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.283780 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " 
pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.283837 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.385946 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386004 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386040 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s449c\" (UniqueName: \"kubernetes.io/projected/17e85cf8-1423-4fd8-a5c0-367c58482277-kube-api-access-s449c\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386064 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 
09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386131 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386197 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386260 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386294 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.386468 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-config-data\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc 
kubenswrapper[4618]: I0121 09:39:01.387165 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.387392 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-config-data\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.394811 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.488672 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.488740 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s449c\" (UniqueName: \"kubernetes.io/projected/17e85cf8-1423-4fd8-a5c0-367c58482277-kube-api-access-s449c\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.488809 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.488849 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.488934 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.489057 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.489587 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.489881 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.490097 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.492452 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.492483 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.503826 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s449c\" (UniqueName: \"kubernetes.io/projected/17e85cf8-1423-4fd8-a5c0-367c58482277-kube-api-access-s449c\") pod \"tempest-tests-tempest\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.516600 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: 
\"17e85cf8-1423-4fd8-a5c0-367c58482277\") " pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.535544 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.948493 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 21 09:39:01 crc kubenswrapper[4618]: I0121 09:39:01.954399 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:39:02 crc kubenswrapper[4618]: I0121 09:39:02.440769 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17e85cf8-1423-4fd8-a5c0-367c58482277","Type":"ContainerStarted","Data":"ad26215cc70cf0c71915a615235348ddb60ac8fa5d322c4f5efe29ad5ef7e03c"} Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.634744 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bp8tb"] Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.637657 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.642436 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bp8tb"] Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.834657 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-utilities\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.834838 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtqk\" (UniqueName: \"kubernetes.io/projected/da2f310e-ba02-4ccf-8db2-0653b6b7110c-kube-api-access-7jtqk\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.834996 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-catalog-content\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.936782 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtqk\" (UniqueName: \"kubernetes.io/projected/da2f310e-ba02-4ccf-8db2-0653b6b7110c-kube-api-access-7jtqk\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.936866 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-catalog-content\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.936970 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-utilities\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.938050 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-catalog-content\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.939133 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-utilities\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.957891 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtqk\" (UniqueName: \"kubernetes.io/projected/da2f310e-ba02-4ccf-8db2-0653b6b7110c-kube-api-access-7jtqk\") pod \"redhat-operators-bp8tb\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:07 crc kubenswrapper[4618]: I0121 09:39:07.958320 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:08 crc kubenswrapper[4618]: I0121 09:39:08.681325 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bp8tb"] Jan 21 09:39:09 crc kubenswrapper[4618]: I0121 09:39:09.521178 4618 generic.go:334] "Generic (PLEG): container finished" podID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerID="0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea" exitCode=0 Jan 21 09:39:09 crc kubenswrapper[4618]: I0121 09:39:09.521540 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerDied","Data":"0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea"} Jan 21 09:39:09 crc kubenswrapper[4618]: I0121 09:39:09.521635 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerStarted","Data":"14dc6a1e57ad4e1532660bffc97c5efa78fe97fac111142170a3146e5fe33071"} Jan 21 09:39:10 crc kubenswrapper[4618]: I0121 09:39:10.543606 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerStarted","Data":"a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24"} Jan 21 09:39:12 crc kubenswrapper[4618]: I0121 09:39:12.558888 4618 generic.go:334] "Generic (PLEG): container finished" podID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerID="a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24" exitCode=0 Jan 21 09:39:12 crc kubenswrapper[4618]: I0121 09:39:12.559197 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" 
event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerDied","Data":"a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24"} Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.361943 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdtx4"] Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.364439 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.368487 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdtx4"] Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.539365 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-catalog-content\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.539405 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-utilities\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.539478 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l54t\" (UniqueName: \"kubernetes.io/projected/5f8b17ef-70ce-41b7-9e4a-889835c99295-kube-api-access-4l54t\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.641375 4618 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l54t\" (UniqueName: \"kubernetes.io/projected/5f8b17ef-70ce-41b7-9e4a-889835c99295-kube-api-access-4l54t\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.641527 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-catalog-content\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.641553 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-utilities\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.642043 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-utilities\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.642850 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-catalog-content\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.658932 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4l54t\" (UniqueName: \"kubernetes.io/projected/5f8b17ef-70ce-41b7-9e4a-889835c99295-kube-api-access-4l54t\") pod \"redhat-marketplace-mdtx4\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:16 crc kubenswrapper[4618]: I0121 09:39:16.682659 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:26 crc kubenswrapper[4618]: I0121 09:39:26.958594 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:39:26 crc kubenswrapper[4618]: I0121 09:39:26.959320 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:39:29 crc kubenswrapper[4618]: E0121 09:39:29.143026 4618 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 21 09:39:29 crc kubenswrapper[4618]: E0121 09:39:29.143462 4618 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s449c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(17e85cf8-1423-4fd8-a5c0-367c58482277): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 09:39:29 crc kubenswrapper[4618]: E0121 09:39:29.144710 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="17e85cf8-1423-4fd8-a5c0-367c58482277" Jan 21 09:39:29 crc kubenswrapper[4618]: I0121 09:39:29.532235 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdtx4"] Jan 21 09:39:29 crc kubenswrapper[4618]: I0121 09:39:29.719824 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" 
event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerStarted","Data":"ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41"} Jan 21 09:39:29 crc kubenswrapper[4618]: I0121 09:39:29.721985 4618 generic.go:334] "Generic (PLEG): container finished" podID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerID="503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2" exitCode=0 Jan 21 09:39:29 crc kubenswrapper[4618]: I0121 09:39:29.723200 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdtx4" event={"ID":"5f8b17ef-70ce-41b7-9e4a-889835c99295","Type":"ContainerDied","Data":"503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2"} Jan 21 09:39:29 crc kubenswrapper[4618]: I0121 09:39:29.723260 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdtx4" event={"ID":"5f8b17ef-70ce-41b7-9e4a-889835c99295","Type":"ContainerStarted","Data":"82844d21f40f9f935cf452d26510f581493fb0d1b21d83b59898af88f4556ba5"} Jan 21 09:39:29 crc kubenswrapper[4618]: E0121 09:39:29.723376 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="17e85cf8-1423-4fd8-a5c0-367c58482277" Jan 21 09:39:29 crc kubenswrapper[4618]: I0121 09:39:29.747637 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bp8tb" podStartSLOduration=3.179252475 podStartE2EDuration="22.747609056s" podCreationTimestamp="2026-01-21 09:39:07 +0000 UTC" firstStartedPulling="2026-01-21 09:39:09.522849557 +0000 UTC m=+2148.273316864" lastFinishedPulling="2026-01-21 09:39:29.091206128 +0000 UTC m=+2167.841673445" observedRunningTime="2026-01-21 09:39:29.74035574 +0000 UTC 
m=+2168.490823056" watchObservedRunningTime="2026-01-21 09:39:29.747609056 +0000 UTC m=+2168.498076373" Jan 21 09:39:30 crc kubenswrapper[4618]: I0121 09:39:30.732899 4618 generic.go:334] "Generic (PLEG): container finished" podID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerID="58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a" exitCode=0 Jan 21 09:39:30 crc kubenswrapper[4618]: I0121 09:39:30.732978 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdtx4" event={"ID":"5f8b17ef-70ce-41b7-9e4a-889835c99295","Type":"ContainerDied","Data":"58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a"} Jan 21 09:39:31 crc kubenswrapper[4618]: I0121 09:39:31.748919 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdtx4" event={"ID":"5f8b17ef-70ce-41b7-9e4a-889835c99295","Type":"ContainerStarted","Data":"1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27"} Jan 21 09:39:31 crc kubenswrapper[4618]: I0121 09:39:31.769325 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdtx4" podStartSLOduration=14.282560366 podStartE2EDuration="15.769309581s" podCreationTimestamp="2026-01-21 09:39:16 +0000 UTC" firstStartedPulling="2026-01-21 09:39:29.723805073 +0000 UTC m=+2168.474272390" lastFinishedPulling="2026-01-21 09:39:31.210554289 +0000 UTC m=+2169.961021605" observedRunningTime="2026-01-21 09:39:31.764910871 +0000 UTC m=+2170.515378189" watchObservedRunningTime="2026-01-21 09:39:31.769309581 +0000 UTC m=+2170.519776899" Jan 21 09:39:36 crc kubenswrapper[4618]: I0121 09:39:36.683972 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:36 crc kubenswrapper[4618]: I0121 09:39:36.684787 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:36 crc kubenswrapper[4618]: I0121 09:39:36.724023 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:36 crc kubenswrapper[4618]: I0121 09:39:36.834613 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:36 crc kubenswrapper[4618]: I0121 09:39:36.966946 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdtx4"] Jan 21 09:39:37 crc kubenswrapper[4618]: I0121 09:39:37.958746 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:37 crc kubenswrapper[4618]: I0121 09:39:37.958995 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:38 crc kubenswrapper[4618]: I0121 09:39:38.816094 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdtx4" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="registry-server" containerID="cri-o://1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27" gracePeriod=2 Jan 21 09:39:38 crc kubenswrapper[4618]: I0121 09:39:38.993808 4618 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bp8tb" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="registry-server" probeResult="failure" output=< Jan 21 09:39:38 crc kubenswrapper[4618]: timeout: failed to connect service ":50051" within 1s Jan 21 09:39:38 crc kubenswrapper[4618]: > Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.301711 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.480299 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l54t\" (UniqueName: \"kubernetes.io/projected/5f8b17ef-70ce-41b7-9e4a-889835c99295-kube-api-access-4l54t\") pod \"5f8b17ef-70ce-41b7-9e4a-889835c99295\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.480448 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-utilities\") pod \"5f8b17ef-70ce-41b7-9e4a-889835c99295\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.480956 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-catalog-content\") pod \"5f8b17ef-70ce-41b7-9e4a-889835c99295\" (UID: \"5f8b17ef-70ce-41b7-9e4a-889835c99295\") " Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.481530 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-utilities" (OuterVolumeSpecName: "utilities") pod "5f8b17ef-70ce-41b7-9e4a-889835c99295" (UID: "5f8b17ef-70ce-41b7-9e4a-889835c99295"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.482288 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.489946 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8b17ef-70ce-41b7-9e4a-889835c99295-kube-api-access-4l54t" (OuterVolumeSpecName: "kube-api-access-4l54t") pod "5f8b17ef-70ce-41b7-9e4a-889835c99295" (UID: "5f8b17ef-70ce-41b7-9e4a-889835c99295"). InnerVolumeSpecName "kube-api-access-4l54t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.506171 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f8b17ef-70ce-41b7-9e4a-889835c99295" (UID: "5f8b17ef-70ce-41b7-9e4a-889835c99295"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.583990 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8b17ef-70ce-41b7-9e4a-889835c99295-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.584020 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l54t\" (UniqueName: \"kubernetes.io/projected/5f8b17ef-70ce-41b7-9e4a-889835c99295-kube-api-access-4l54t\") on node \"crc\" DevicePath \"\"" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.827517 4618 generic.go:334] "Generic (PLEG): container finished" podID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerID="1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27" exitCode=0 Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.827571 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdtx4" event={"ID":"5f8b17ef-70ce-41b7-9e4a-889835c99295","Type":"ContainerDied","Data":"1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27"} Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.827594 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdtx4" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.827612 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdtx4" event={"ID":"5f8b17ef-70ce-41b7-9e4a-889835c99295","Type":"ContainerDied","Data":"82844d21f40f9f935cf452d26510f581493fb0d1b21d83b59898af88f4556ba5"} Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.827639 4618 scope.go:117] "RemoveContainer" containerID="1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.852520 4618 scope.go:117] "RemoveContainer" containerID="58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.853241 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdtx4"] Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.860021 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdtx4"] Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.872875 4618 scope.go:117] "RemoveContainer" containerID="503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.904128 4618 scope.go:117] "RemoveContainer" containerID="1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27" Jan 21 09:39:39 crc kubenswrapper[4618]: E0121 09:39:39.904509 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27\": container with ID starting with 1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27 not found: ID does not exist" containerID="1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.904548 4618 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27"} err="failed to get container status \"1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27\": rpc error: code = NotFound desc = could not find container \"1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27\": container with ID starting with 1c2377d641c37aea86cf1075dff8231e8b1e60e9bf4f80fbd0535ebaedba5f27 not found: ID does not exist" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.904574 4618 scope.go:117] "RemoveContainer" containerID="58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a" Jan 21 09:39:39 crc kubenswrapper[4618]: E0121 09:39:39.905015 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a\": container with ID starting with 58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a not found: ID does not exist" containerID="58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.905069 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a"} err="failed to get container status \"58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a\": rpc error: code = NotFound desc = could not find container \"58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a\": container with ID starting with 58850c90e66d236b7f8350d9fd95b2068483e02c967d2a2da30dbd1455b1dc8a not found: ID does not exist" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.905102 4618 scope.go:117] "RemoveContainer" containerID="503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2" Jan 21 09:39:39 crc kubenswrapper[4618]: E0121 
09:39:39.905570 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2\": container with ID starting with 503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2 not found: ID does not exist" containerID="503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2" Jan 21 09:39:39 crc kubenswrapper[4618]: I0121 09:39:39.905598 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2"} err="failed to get container status \"503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2\": rpc error: code = NotFound desc = could not find container \"503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2\": container with ID starting with 503dd3eb022c3de6ab1b990093a4f5f339ee5549ae4daa3c14b657692b3bf8f2 not found: ID does not exist" Jan 21 09:39:41 crc kubenswrapper[4618]: I0121 09:39:41.548477 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" path="/var/lib/kubelet/pods/5f8b17ef-70ce-41b7-9e4a-889835c99295/volumes" Jan 21 09:39:42 crc kubenswrapper[4618]: I0121 09:39:42.036285 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 21 09:39:43 crc kubenswrapper[4618]: I0121 09:39:43.867184 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17e85cf8-1423-4fd8-a5c0-367c58482277","Type":"ContainerStarted","Data":"cd559c537c1b227b46aba08c5e2ff87bf706cd47a6afe0bf574f888c04d1f6d3"} Jan 21 09:39:43 crc kubenswrapper[4618]: I0121 09:39:43.890645 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.810599466 podStartE2EDuration="43.89062984s" 
podCreationTimestamp="2026-01-21 09:39:00 +0000 UTC" firstStartedPulling="2026-01-21 09:39:01.954021437 +0000 UTC m=+2140.704488754" lastFinishedPulling="2026-01-21 09:39:42.034051821 +0000 UTC m=+2180.784519128" observedRunningTime="2026-01-21 09:39:43.887615253 +0000 UTC m=+2182.638082570" watchObservedRunningTime="2026-01-21 09:39:43.89062984 +0000 UTC m=+2182.641097158" Jan 21 09:39:47 crc kubenswrapper[4618]: I0121 09:39:47.994347 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:48 crc kubenswrapper[4618]: I0121 09:39:48.029974 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:48 crc kubenswrapper[4618]: I0121 09:39:48.225629 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bp8tb"] Jan 21 09:39:49 crc kubenswrapper[4618]: I0121 09:39:49.948023 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bp8tb" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="registry-server" containerID="cri-o://ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41" gracePeriod=2 Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.392668 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.557812 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-catalog-content\") pod \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.557981 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-utilities\") pod \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.558160 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jtqk\" (UniqueName: \"kubernetes.io/projected/da2f310e-ba02-4ccf-8db2-0653b6b7110c-kube-api-access-7jtqk\") pod \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\" (UID: \"da2f310e-ba02-4ccf-8db2-0653b6b7110c\") " Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.558505 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-utilities" (OuterVolumeSpecName: "utilities") pod "da2f310e-ba02-4ccf-8db2-0653b6b7110c" (UID: "da2f310e-ba02-4ccf-8db2-0653b6b7110c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.558785 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.564935 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2f310e-ba02-4ccf-8db2-0653b6b7110c-kube-api-access-7jtqk" (OuterVolumeSpecName: "kube-api-access-7jtqk") pod "da2f310e-ba02-4ccf-8db2-0653b6b7110c" (UID: "da2f310e-ba02-4ccf-8db2-0653b6b7110c"). InnerVolumeSpecName "kube-api-access-7jtqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.654108 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da2f310e-ba02-4ccf-8db2-0653b6b7110c" (UID: "da2f310e-ba02-4ccf-8db2-0653b6b7110c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.661595 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jtqk\" (UniqueName: \"kubernetes.io/projected/da2f310e-ba02-4ccf-8db2-0653b6b7110c-kube-api-access-7jtqk\") on node \"crc\" DevicePath \"\"" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.661641 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2f310e-ba02-4ccf-8db2-0653b6b7110c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.963300 4618 generic.go:334] "Generic (PLEG): container finished" podID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerID="ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41" exitCode=0 Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.963394 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerDied","Data":"ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41"} Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.963453 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp8tb" event={"ID":"da2f310e-ba02-4ccf-8db2-0653b6b7110c","Type":"ContainerDied","Data":"14dc6a1e57ad4e1532660bffc97c5efa78fe97fac111142170a3146e5fe33071"} Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.963476 4618 scope.go:117] "RemoveContainer" containerID="ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.964303 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bp8tb" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.989579 4618 scope.go:117] "RemoveContainer" containerID="a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24" Jan 21 09:39:50 crc kubenswrapper[4618]: I0121 09:39:50.995297 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bp8tb"] Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.005264 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bp8tb"] Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.025508 4618 scope.go:117] "RemoveContainer" containerID="0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.061590 4618 scope.go:117] "RemoveContainer" containerID="ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41" Jan 21 09:39:51 crc kubenswrapper[4618]: E0121 09:39:51.062012 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41\": container with ID starting with ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41 not found: ID does not exist" containerID="ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.062049 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41"} err="failed to get container status \"ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41\": rpc error: code = NotFound desc = could not find container \"ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41\": container with ID starting with ec49a358b42d29508b03e561f99ed292e33f65728aa99150085a7dfa485d0d41 not found: ID does 
not exist" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.062076 4618 scope.go:117] "RemoveContainer" containerID="a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24" Jan 21 09:39:51 crc kubenswrapper[4618]: E0121 09:39:51.062634 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24\": container with ID starting with a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24 not found: ID does not exist" containerID="a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.062659 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24"} err="failed to get container status \"a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24\": rpc error: code = NotFound desc = could not find container \"a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24\": container with ID starting with a49dd33ecd5f804babefa1636008b826c285632cdc91ef057d4ff3502e169b24 not found: ID does not exist" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.062673 4618 scope.go:117] "RemoveContainer" containerID="0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea" Jan 21 09:39:51 crc kubenswrapper[4618]: E0121 09:39:51.063007 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea\": container with ID starting with 0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea not found: ID does not exist" containerID="0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.063028 4618 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea"} err="failed to get container status \"0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea\": rpc error: code = NotFound desc = could not find container \"0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea\": container with ID starting with 0d4a624f1c7df6fd1e19fee1e3efe512a78b7ac9af58775d8f7ce77a5cbd77ea not found: ID does not exist" Jan 21 09:39:51 crc kubenswrapper[4618]: I0121 09:39:51.548707 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" path="/var/lib/kubelet/pods/da2f310e-ba02-4ccf-8db2-0653b6b7110c/volumes" Jan 21 09:39:56 crc kubenswrapper[4618]: I0121 09:39:56.959251 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:39:56 crc kubenswrapper[4618]: I0121 09:39:56.959884 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:39:56 crc kubenswrapper[4618]: I0121 09:39:56.959930 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:39:56 crc kubenswrapper[4618]: I0121 09:39:56.960399 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"924e31a43b2bbfc76860cd081d76cdf76941a38a9ce3969a25b5de990ef00dea"} 
pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:39:56 crc kubenswrapper[4618]: I0121 09:39:56.960449 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://924e31a43b2bbfc76860cd081d76cdf76941a38a9ce3969a25b5de990ef00dea" gracePeriod=600 Jan 21 09:39:58 crc kubenswrapper[4618]: I0121 09:39:58.033244 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="924e31a43b2bbfc76860cd081d76cdf76941a38a9ce3969a25b5de990ef00dea" exitCode=0 Jan 21 09:39:58 crc kubenswrapper[4618]: I0121 09:39:58.033323 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"924e31a43b2bbfc76860cd081d76cdf76941a38a9ce3969a25b5de990ef00dea"} Jan 21 09:39:58 crc kubenswrapper[4618]: I0121 09:39:58.033928 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e"} Jan 21 09:39:58 crc kubenswrapper[4618]: I0121 09:39:58.033967 4618 scope.go:117] "RemoveContainer" containerID="f69dd22c288555725f29cbc66c561c8bedcc5b802e4eb79a572830de5072345c" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.418095 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twsn7"] Jan 21 09:41:12 crc kubenswrapper[4618]: E0121 09:41:12.418926 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="extract-utilities" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.418938 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="extract-utilities" Jan 21 09:41:12 crc kubenswrapper[4618]: E0121 09:41:12.418956 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="registry-server" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.418961 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="registry-server" Jan 21 09:41:12 crc kubenswrapper[4618]: E0121 09:41:12.418975 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="extract-content" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.418980 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="extract-content" Jan 21 09:41:12 crc kubenswrapper[4618]: E0121 09:41:12.418994 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="registry-server" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.418999 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="registry-server" Jan 21 09:41:12 crc kubenswrapper[4618]: E0121 09:41:12.419020 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="extract-utilities" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.419025 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="extract-utilities" Jan 21 09:41:12 crc kubenswrapper[4618]: E0121 09:41:12.419033 4618 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="extract-content" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.419397 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="extract-content" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.419566 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8b17ef-70ce-41b7-9e4a-889835c99295" containerName="registry-server" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.419590 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2f310e-ba02-4ccf-8db2-0653b6b7110c" containerName="registry-server" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.420819 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.427504 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twsn7"] Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.510544 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-catalog-content\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.510705 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddsf2\" (UniqueName: \"kubernetes.io/projected/137397cf-f079-43b7-95e3-e44639cc58c5-kube-api-access-ddsf2\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.510904 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-utilities\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.613502 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-utilities\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.613775 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-catalog-content\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.613907 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsf2\" (UniqueName: \"kubernetes.io/projected/137397cf-f079-43b7-95e3-e44639cc58c5-kube-api-access-ddsf2\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.614058 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-utilities\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.614382 4618 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-catalog-content\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.634361 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddsf2\" (UniqueName: \"kubernetes.io/projected/137397cf-f079-43b7-95e3-e44639cc58c5-kube-api-access-ddsf2\") pod \"community-operators-twsn7\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:12 crc kubenswrapper[4618]: I0121 09:41:12.742772 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:13 crc kubenswrapper[4618]: I0121 09:41:13.170056 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twsn7"] Jan 21 09:41:13 crc kubenswrapper[4618]: I0121 09:41:13.678427 4618 generic.go:334] "Generic (PLEG): container finished" podID="137397cf-f079-43b7-95e3-e44639cc58c5" containerID="8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5" exitCode=0 Jan 21 09:41:13 crc kubenswrapper[4618]: I0121 09:41:13.678493 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerDied","Data":"8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5"} Jan 21 09:41:13 crc kubenswrapper[4618]: I0121 09:41:13.678708 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerStarted","Data":"5dfc954e27b7254d05e6cf80f51f9211fd2310b72d8fecab00ead235b04f007a"} Jan 21 09:41:14 crc kubenswrapper[4618]: I0121 
09:41:14.689959 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerStarted","Data":"2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26"} Jan 21 09:41:15 crc kubenswrapper[4618]: I0121 09:41:15.699351 4618 generic.go:334] "Generic (PLEG): container finished" podID="137397cf-f079-43b7-95e3-e44639cc58c5" containerID="2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26" exitCode=0 Jan 21 09:41:15 crc kubenswrapper[4618]: I0121 09:41:15.699439 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerDied","Data":"2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26"} Jan 21 09:41:16 crc kubenswrapper[4618]: I0121 09:41:16.709320 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerStarted","Data":"4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59"} Jan 21 09:41:16 crc kubenswrapper[4618]: I0121 09:41:16.724616 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twsn7" podStartSLOduration=2.20946537 podStartE2EDuration="4.724603804s" podCreationTimestamp="2026-01-21 09:41:12 +0000 UTC" firstStartedPulling="2026-01-21 09:41:13.680575648 +0000 UTC m=+2272.431042955" lastFinishedPulling="2026-01-21 09:41:16.195714072 +0000 UTC m=+2274.946181389" observedRunningTime="2026-01-21 09:41:16.721496082 +0000 UTC m=+2275.471963399" watchObservedRunningTime="2026-01-21 09:41:16.724603804 +0000 UTC m=+2275.475071122" Jan 21 09:41:22 crc kubenswrapper[4618]: I0121 09:41:22.743573 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twsn7" 
Jan 21 09:41:22 crc kubenswrapper[4618]: I0121 09:41:22.744117 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:22 crc kubenswrapper[4618]: I0121 09:41:22.787064 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:22 crc kubenswrapper[4618]: I0121 09:41:22.823970 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:23 crc kubenswrapper[4618]: I0121 09:41:23.018548 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twsn7"] Jan 21 09:41:24 crc kubenswrapper[4618]: I0121 09:41:24.788299 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twsn7" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="registry-server" containerID="cri-o://4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59" gracePeriod=2 Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.244426 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.278841 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddsf2\" (UniqueName: \"kubernetes.io/projected/137397cf-f079-43b7-95e3-e44639cc58c5-kube-api-access-ddsf2\") pod \"137397cf-f079-43b7-95e3-e44639cc58c5\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.278916 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-utilities\") pod \"137397cf-f079-43b7-95e3-e44639cc58c5\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.279000 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-catalog-content\") pod \"137397cf-f079-43b7-95e3-e44639cc58c5\" (UID: \"137397cf-f079-43b7-95e3-e44639cc58c5\") " Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.280304 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-utilities" (OuterVolumeSpecName: "utilities") pod "137397cf-f079-43b7-95e3-e44639cc58c5" (UID: "137397cf-f079-43b7-95e3-e44639cc58c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.287237 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137397cf-f079-43b7-95e3-e44639cc58c5-kube-api-access-ddsf2" (OuterVolumeSpecName: "kube-api-access-ddsf2") pod "137397cf-f079-43b7-95e3-e44639cc58c5" (UID: "137397cf-f079-43b7-95e3-e44639cc58c5"). InnerVolumeSpecName "kube-api-access-ddsf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.323497 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "137397cf-f079-43b7-95e3-e44639cc58c5" (UID: "137397cf-f079-43b7-95e3-e44639cc58c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.380129 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddsf2\" (UniqueName: \"kubernetes.io/projected/137397cf-f079-43b7-95e3-e44639cc58c5-kube-api-access-ddsf2\") on node \"crc\" DevicePath \"\"" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.380187 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.380199 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137397cf-f079-43b7-95e3-e44639cc58c5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.801377 4618 generic.go:334] "Generic (PLEG): container finished" podID="137397cf-f079-43b7-95e3-e44639cc58c5" containerID="4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59" exitCode=0 Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.801434 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerDied","Data":"4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59"} Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.801477 4618 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-twsn7" event={"ID":"137397cf-f079-43b7-95e3-e44639cc58c5","Type":"ContainerDied","Data":"5dfc954e27b7254d05e6cf80f51f9211fd2310b72d8fecab00ead235b04f007a"} Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.801497 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twsn7" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.801513 4618 scope.go:117] "RemoveContainer" containerID="4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.823318 4618 scope.go:117] "RemoveContainer" containerID="2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.832698 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twsn7"] Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.842087 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twsn7"] Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.844137 4618 scope.go:117] "RemoveContainer" containerID="8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.884284 4618 scope.go:117] "RemoveContainer" containerID="4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59" Jan 21 09:41:25 crc kubenswrapper[4618]: E0121 09:41:25.884710 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59\": container with ID starting with 4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59 not found: ID does not exist" containerID="4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 
09:41:25.884749 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59"} err="failed to get container status \"4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59\": rpc error: code = NotFound desc = could not find container \"4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59\": container with ID starting with 4d21a94dfd45ac6d5d96e10bf6ee8b06745f282dca0934bcb11cc66447933c59 not found: ID does not exist" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.884775 4618 scope.go:117] "RemoveContainer" containerID="2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26" Jan 21 09:41:25 crc kubenswrapper[4618]: E0121 09:41:25.885035 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26\": container with ID starting with 2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26 not found: ID does not exist" containerID="2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.885079 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26"} err="failed to get container status \"2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26\": rpc error: code = NotFound desc = could not find container \"2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26\": container with ID starting with 2cd56e6a757e2dca445a4b425655e43c91703b4d579fbacb70242743e7b3fc26 not found: ID does not exist" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.885103 4618 scope.go:117] "RemoveContainer" containerID="8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5" Jan 21 09:41:25 crc 
kubenswrapper[4618]: E0121 09:41:25.885625 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5\": container with ID starting with 8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5 not found: ID does not exist" containerID="8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5" Jan 21 09:41:25 crc kubenswrapper[4618]: I0121 09:41:25.885655 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5"} err="failed to get container status \"8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5\": rpc error: code = NotFound desc = could not find container \"8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5\": container with ID starting with 8e3e4b8787306a45c17579f6677dbd35186d3acef7cac38228f3f51e7f4829a5 not found: ID does not exist" Jan 21 09:41:27 crc kubenswrapper[4618]: I0121 09:41:27.547001 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" path="/var/lib/kubelet/pods/137397cf-f079-43b7-95e3-e44639cc58c5/volumes" Jan 21 09:42:26 crc kubenswrapper[4618]: I0121 09:42:26.959366 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:42:26 crc kubenswrapper[4618]: I0121 09:42:26.959806 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 21 09:42:56 crc kubenswrapper[4618]: I0121 09:42:56.959290 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:42:56 crc kubenswrapper[4618]: I0121 09:42:56.959732 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:43:26 crc kubenswrapper[4618]: I0121 09:43:26.959021 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:43:26 crc kubenswrapper[4618]: I0121 09:43:26.959582 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:43:26 crc kubenswrapper[4618]: I0121 09:43:26.959634 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:43:26 crc kubenswrapper[4618]: I0121 09:43:26.960165 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:43:26 crc kubenswrapper[4618]: I0121 09:43:26.960218 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" gracePeriod=600 Jan 21 09:43:27 crc kubenswrapper[4618]: E0121 09:43:27.078104 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:43:27 crc kubenswrapper[4618]: I0121 09:43:27.688069 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" exitCode=0 Jan 21 09:43:27 crc kubenswrapper[4618]: I0121 09:43:27.688131 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e"} Jan 21 09:43:27 crc kubenswrapper[4618]: I0121 09:43:27.688477 4618 scope.go:117] "RemoveContainer" containerID="924e31a43b2bbfc76860cd081d76cdf76941a38a9ce3969a25b5de990ef00dea" Jan 21 09:43:27 crc kubenswrapper[4618]: I0121 09:43:27.689025 4618 
scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:43:27 crc kubenswrapper[4618]: E0121 09:43:27.689491 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:43:38 crc kubenswrapper[4618]: I0121 09:43:38.538175 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:43:38 crc kubenswrapper[4618]: E0121 09:43:38.538934 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:43:53 crc kubenswrapper[4618]: I0121 09:43:53.538374 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:43:53 crc kubenswrapper[4618]: E0121 09:43:53.539639 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:44:05 crc kubenswrapper[4618]: I0121 
09:44:05.537801 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:44:05 crc kubenswrapper[4618]: E0121 09:44:05.538545 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:44:17 crc kubenswrapper[4618]: I0121 09:44:17.541396 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:44:17 crc kubenswrapper[4618]: E0121 09:44:17.543561 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:44:32 crc kubenswrapper[4618]: I0121 09:44:32.537911 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:44:32 crc kubenswrapper[4618]: E0121 09:44:32.538978 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:44:46 crc 
kubenswrapper[4618]: I0121 09:44:46.537491 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:44:46 crc kubenswrapper[4618]: E0121 09:44:46.538324 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.138443 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949"] Jan 21 09:45:00 crc kubenswrapper[4618]: E0121 09:45:00.139322 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="extract-content" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.139395 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="extract-content" Jan 21 09:45:00 crc kubenswrapper[4618]: E0121 09:45:00.139414 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="registry-server" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.139420 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="registry-server" Jan 21 09:45:00 crc kubenswrapper[4618]: E0121 09:45:00.139447 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="extract-utilities" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.139454 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="extract-utilities" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.139617 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="137397cf-f079-43b7-95e3-e44639cc58c5" containerName="registry-server" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.140183 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.142631 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.142887 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.143874 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949"] Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.145102 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55446ec2-c64d-44e4-816e-2bb33bc2ac57-secret-volume\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.145442 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55446ec2-c64d-44e4-816e-2bb33bc2ac57-config-volume\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc 
kubenswrapper[4618]: I0121 09:45:00.145507 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5jzr\" (UniqueName: \"kubernetes.io/projected/55446ec2-c64d-44e4-816e-2bb33bc2ac57-kube-api-access-v5jzr\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.246627 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55446ec2-c64d-44e4-816e-2bb33bc2ac57-secret-volume\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.246759 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55446ec2-c64d-44e4-816e-2bb33bc2ac57-config-volume\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.247040 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5jzr\" (UniqueName: \"kubernetes.io/projected/55446ec2-c64d-44e4-816e-2bb33bc2ac57-kube-api-access-v5jzr\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.247649 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55446ec2-c64d-44e4-816e-2bb33bc2ac57-config-volume\") pod \"collect-profiles-29483145-6j949\" 
(UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.253122 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55446ec2-c64d-44e4-816e-2bb33bc2ac57-secret-volume\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.263030 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5jzr\" (UniqueName: \"kubernetes.io/projected/55446ec2-c64d-44e4-816e-2bb33bc2ac57-kube-api-access-v5jzr\") pod \"collect-profiles-29483145-6j949\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.457217 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:00 crc kubenswrapper[4618]: I0121 09:45:00.866284 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949"] Jan 21 09:45:01 crc kubenswrapper[4618]: I0121 09:45:01.498971 4618 generic.go:334] "Generic (PLEG): container finished" podID="55446ec2-c64d-44e4-816e-2bb33bc2ac57" containerID="d40a73633e42be2b22b0b4186a8c3e05140a890dfff508add85397b52fa096ba" exitCode=0 Jan 21 09:45:01 crc kubenswrapper[4618]: I0121 09:45:01.499023 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" event={"ID":"55446ec2-c64d-44e4-816e-2bb33bc2ac57","Type":"ContainerDied","Data":"d40a73633e42be2b22b0b4186a8c3e05140a890dfff508add85397b52fa096ba"} Jan 21 09:45:01 crc kubenswrapper[4618]: I0121 09:45:01.500194 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" event={"ID":"55446ec2-c64d-44e4-816e-2bb33bc2ac57","Type":"ContainerStarted","Data":"3b5f62d83c5c3b64cc990c58aa824759df082995f1af5bd2ea8a67776b3f8209"} Jan 21 09:45:01 crc kubenswrapper[4618]: I0121 09:45:01.544029 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:45:01 crc kubenswrapper[4618]: E0121 09:45:01.544468 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:45:02 crc kubenswrapper[4618]: I0121 09:45:02.833812 4618 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.002986 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5jzr\" (UniqueName: \"kubernetes.io/projected/55446ec2-c64d-44e4-816e-2bb33bc2ac57-kube-api-access-v5jzr\") pod \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.003321 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55446ec2-c64d-44e4-816e-2bb33bc2ac57-config-volume\") pod \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.003454 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55446ec2-c64d-44e4-816e-2bb33bc2ac57-secret-volume\") pod \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\" (UID: \"55446ec2-c64d-44e4-816e-2bb33bc2ac57\") " Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.003837 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55446ec2-c64d-44e4-816e-2bb33bc2ac57-config-volume" (OuterVolumeSpecName: "config-volume") pod "55446ec2-c64d-44e4-816e-2bb33bc2ac57" (UID: "55446ec2-c64d-44e4-816e-2bb33bc2ac57"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.008855 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55446ec2-c64d-44e4-816e-2bb33bc2ac57-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "55446ec2-c64d-44e4-816e-2bb33bc2ac57" (UID: "55446ec2-c64d-44e4-816e-2bb33bc2ac57"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.009573 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55446ec2-c64d-44e4-816e-2bb33bc2ac57-kube-api-access-v5jzr" (OuterVolumeSpecName: "kube-api-access-v5jzr") pod "55446ec2-c64d-44e4-816e-2bb33bc2ac57" (UID: "55446ec2-c64d-44e4-816e-2bb33bc2ac57"). InnerVolumeSpecName "kube-api-access-v5jzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.105132 4618 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55446ec2-c64d-44e4-816e-2bb33bc2ac57-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.105176 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5jzr\" (UniqueName: \"kubernetes.io/projected/55446ec2-c64d-44e4-816e-2bb33bc2ac57-kube-api-access-v5jzr\") on node \"crc\" DevicePath \"\"" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.105187 4618 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55446ec2-c64d-44e4-816e-2bb33bc2ac57-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.515570 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" event={"ID":"55446ec2-c64d-44e4-816e-2bb33bc2ac57","Type":"ContainerDied","Data":"3b5f62d83c5c3b64cc990c58aa824759df082995f1af5bd2ea8a67776b3f8209"} Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.515623 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483145-6j949" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.515646 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5f62d83c5c3b64cc990c58aa824759df082995f1af5bd2ea8a67776b3f8209" Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.889957 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd"] Jan 21 09:45:03 crc kubenswrapper[4618]: I0121 09:45:03.896622 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483100-jx4hd"] Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.181914 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nm9cz"] Jan 21 09:45:04 crc kubenswrapper[4618]: E0121 09:45:04.182338 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55446ec2-c64d-44e4-816e-2bb33bc2ac57" containerName="collect-profiles" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.182356 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="55446ec2-c64d-44e4-816e-2bb33bc2ac57" containerName="collect-profiles" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.182554 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="55446ec2-c64d-44e4-816e-2bb33bc2ac57" containerName="collect-profiles" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.183970 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.193355 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nm9cz"] Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.330905 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-utilities\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.331333 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-catalog-content\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.331538 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6trsj\" (UniqueName: \"kubernetes.io/projected/7f9da60d-f23b-4104-ba5f-8d498377229c-kube-api-access-6trsj\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.432935 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-utilities\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.433050 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-catalog-content\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.433100 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6trsj\" (UniqueName: \"kubernetes.io/projected/7f9da60d-f23b-4104-ba5f-8d498377229c-kube-api-access-6trsj\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.433819 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-utilities\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.434082 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-catalog-content\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.464113 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6trsj\" (UniqueName: \"kubernetes.io/projected/7f9da60d-f23b-4104-ba5f-8d498377229c-kube-api-access-6trsj\") pod \"certified-operators-nm9cz\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.498889 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:04 crc kubenswrapper[4618]: I0121 09:45:04.951548 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nm9cz"] Jan 21 09:45:05 crc kubenswrapper[4618]: I0121 09:45:05.529873 4618 generic.go:334] "Generic (PLEG): container finished" podID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerID="14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3" exitCode=0 Jan 21 09:45:05 crc kubenswrapper[4618]: I0121 09:45:05.529918 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm9cz" event={"ID":"7f9da60d-f23b-4104-ba5f-8d498377229c","Type":"ContainerDied","Data":"14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3"} Jan 21 09:45:05 crc kubenswrapper[4618]: I0121 09:45:05.530434 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm9cz" event={"ID":"7f9da60d-f23b-4104-ba5f-8d498377229c","Type":"ContainerStarted","Data":"b43a8a6ed4381a1cfa1b3cbe8efb68806ec299fcf683b7821d73fe1f1a28aa73"} Jan 21 09:45:05 crc kubenswrapper[4618]: I0121 09:45:05.533477 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:45:05 crc kubenswrapper[4618]: I0121 09:45:05.551165 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06263ed-35a1-4532-ba9a-8521ec8a5b1d" path="/var/lib/kubelet/pods/e06263ed-35a1-4532-ba9a-8521ec8a5b1d/volumes" Jan 21 09:45:06 crc kubenswrapper[4618]: I0121 09:45:06.540689 4618 generic.go:334] "Generic (PLEG): container finished" podID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerID="b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af" exitCode=0 Jan 21 09:45:06 crc kubenswrapper[4618]: I0121 09:45:06.540774 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm9cz" 
event={"ID":"7f9da60d-f23b-4104-ba5f-8d498377229c","Type":"ContainerDied","Data":"b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af"} Jan 21 09:45:07 crc kubenswrapper[4618]: I0121 09:45:07.549135 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm9cz" event={"ID":"7f9da60d-f23b-4104-ba5f-8d498377229c","Type":"ContainerStarted","Data":"c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da"} Jan 21 09:45:07 crc kubenswrapper[4618]: I0121 09:45:07.565595 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nm9cz" podStartSLOduration=1.9770638379999999 podStartE2EDuration="3.565582583s" podCreationTimestamp="2026-01-21 09:45:04 +0000 UTC" firstStartedPulling="2026-01-21 09:45:05.533274397 +0000 UTC m=+2504.283741714" lastFinishedPulling="2026-01-21 09:45:07.121793141 +0000 UTC m=+2505.872260459" observedRunningTime="2026-01-21 09:45:07.561556963 +0000 UTC m=+2506.312024280" watchObservedRunningTime="2026-01-21 09:45:07.565582583 +0000 UTC m=+2506.316049900" Jan 21 09:45:14 crc kubenswrapper[4618]: I0121 09:45:14.499174 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:14 crc kubenswrapper[4618]: I0121 09:45:14.499654 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:14 crc kubenswrapper[4618]: I0121 09:45:14.535221 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:14 crc kubenswrapper[4618]: I0121 09:45:14.538399 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:45:14 crc kubenswrapper[4618]: E0121 09:45:14.538654 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:45:14 crc kubenswrapper[4618]: I0121 09:45:14.632446 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:14 crc kubenswrapper[4618]: I0121 09:45:14.767617 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nm9cz"] Jan 21 09:45:16 crc kubenswrapper[4618]: I0121 09:45:16.612825 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nm9cz" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="registry-server" containerID="cri-o://c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da" gracePeriod=2 Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.019194 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.149884 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-catalog-content\") pod \"7f9da60d-f23b-4104-ba5f-8d498377229c\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.150032 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6trsj\" (UniqueName: \"kubernetes.io/projected/7f9da60d-f23b-4104-ba5f-8d498377229c-kube-api-access-6trsj\") pod \"7f9da60d-f23b-4104-ba5f-8d498377229c\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.150221 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-utilities\") pod \"7f9da60d-f23b-4104-ba5f-8d498377229c\" (UID: \"7f9da60d-f23b-4104-ba5f-8d498377229c\") " Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.150739 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-utilities" (OuterVolumeSpecName: "utilities") pod "7f9da60d-f23b-4104-ba5f-8d498377229c" (UID: "7f9da60d-f23b-4104-ba5f-8d498377229c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.151369 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.156495 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9da60d-f23b-4104-ba5f-8d498377229c-kube-api-access-6trsj" (OuterVolumeSpecName: "kube-api-access-6trsj") pod "7f9da60d-f23b-4104-ba5f-8d498377229c" (UID: "7f9da60d-f23b-4104-ba5f-8d498377229c"). InnerVolumeSpecName "kube-api-access-6trsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.183377 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f9da60d-f23b-4104-ba5f-8d498377229c" (UID: "7f9da60d-f23b-4104-ba5f-8d498377229c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.254100 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f9da60d-f23b-4104-ba5f-8d498377229c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.254129 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6trsj\" (UniqueName: \"kubernetes.io/projected/7f9da60d-f23b-4104-ba5f-8d498377229c-kube-api-access-6trsj\") on node \"crc\" DevicePath \"\"" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.621525 4618 generic.go:334] "Generic (PLEG): container finished" podID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerID="c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da" exitCode=0 Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.621570 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm9cz" event={"ID":"7f9da60d-f23b-4104-ba5f-8d498377229c","Type":"ContainerDied","Data":"c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da"} Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.621600 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nm9cz" event={"ID":"7f9da60d-f23b-4104-ba5f-8d498377229c","Type":"ContainerDied","Data":"b43a8a6ed4381a1cfa1b3cbe8efb68806ec299fcf683b7821d73fe1f1a28aa73"} Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.621618 4618 scope.go:117] "RemoveContainer" containerID="c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.621746 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nm9cz" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.641222 4618 scope.go:117] "RemoveContainer" containerID="b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.641537 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nm9cz"] Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.647711 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nm9cz"] Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.659289 4618 scope.go:117] "RemoveContainer" containerID="14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.692633 4618 scope.go:117] "RemoveContainer" containerID="c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da" Jan 21 09:45:17 crc kubenswrapper[4618]: E0121 09:45:17.693230 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da\": container with ID starting with c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da not found: ID does not exist" containerID="c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.693271 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da"} err="failed to get container status \"c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da\": rpc error: code = NotFound desc = could not find container \"c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da\": container with ID starting with c1b48c9011d881162edc2bff6d292c2f0b555a5fa78bdbcb1ae481ab21df10da not 
found: ID does not exist" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.693294 4618 scope.go:117] "RemoveContainer" containerID="b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af" Jan 21 09:45:17 crc kubenswrapper[4618]: E0121 09:45:17.693715 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af\": container with ID starting with b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af not found: ID does not exist" containerID="b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.693752 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af"} err="failed to get container status \"b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af\": rpc error: code = NotFound desc = could not find container \"b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af\": container with ID starting with b979a6ae75b00d83b3c2017fe6bc07e37dcf8375be811113c39e638af961b6af not found: ID does not exist" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.693779 4618 scope.go:117] "RemoveContainer" containerID="14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3" Jan 21 09:45:17 crc kubenswrapper[4618]: E0121 09:45:17.694217 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3\": container with ID starting with 14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3 not found: ID does not exist" containerID="14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3" Jan 21 09:45:17 crc kubenswrapper[4618]: I0121 09:45:17.694243 4618 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3"} err="failed to get container status \"14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3\": rpc error: code = NotFound desc = could not find container \"14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3\": container with ID starting with 14c21f5debe969e4f6f030d88a50ed39cf0b570a70b122be113820f1b9a424b3 not found: ID does not exist" Jan 21 09:45:19 crc kubenswrapper[4618]: I0121 09:45:19.545709 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" path="/var/lib/kubelet/pods/7f9da60d-f23b-4104-ba5f-8d498377229c/volumes" Jan 21 09:45:25 crc kubenswrapper[4618]: I0121 09:45:25.537400 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:45:25 crc kubenswrapper[4618]: E0121 09:45:25.537932 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:45:29 crc kubenswrapper[4618]: I0121 09:45:29.276512 4618 scope.go:117] "RemoveContainer" containerID="fd7d181e949ef57516c19b2d6ac95a0bc112bbc4e32b560f2176adf303104b61" Jan 21 09:45:37 crc kubenswrapper[4618]: I0121 09:45:37.537250 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:45:37 crc kubenswrapper[4618]: E0121 09:45:37.538673 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:45:48 crc kubenswrapper[4618]: I0121 09:45:48.537544 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:45:48 crc kubenswrapper[4618]: E0121 09:45:48.538257 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:46:02 crc kubenswrapper[4618]: I0121 09:46:02.538175 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:46:02 crc kubenswrapper[4618]: E0121 09:46:02.539108 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:46:15 crc kubenswrapper[4618]: I0121 09:46:15.538856 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:46:15 crc kubenswrapper[4618]: E0121 09:46:15.539697 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:46:29 crc kubenswrapper[4618]: I0121 09:46:29.537734 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:46:29 crc kubenswrapper[4618]: E0121 09:46:29.538507 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:46:41 crc kubenswrapper[4618]: I0121 09:46:41.542002 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:46:41 crc kubenswrapper[4618]: E0121 09:46:41.542697 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:46:52 crc kubenswrapper[4618]: I0121 09:46:52.537231 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:46:52 crc kubenswrapper[4618]: E0121 09:46:52.537815 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:47:04 crc kubenswrapper[4618]: I0121 09:47:04.537843 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:47:04 crc kubenswrapper[4618]: E0121 09:47:04.538511 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:47:18 crc kubenswrapper[4618]: I0121 09:47:18.537708 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:47:18 crc kubenswrapper[4618]: E0121 09:47:18.538763 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:47:31 crc kubenswrapper[4618]: I0121 09:47:31.542170 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:47:31 crc kubenswrapper[4618]: E0121 09:47:31.543025 4618 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:47:45 crc kubenswrapper[4618]: I0121 09:47:45.538947 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:47:45 crc kubenswrapper[4618]: E0121 09:47:45.540000 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:47:58 crc kubenswrapper[4618]: I0121 09:47:58.538015 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:47:58 crc kubenswrapper[4618]: E0121 09:47:58.538892 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:48:12 crc kubenswrapper[4618]: I0121 09:48:12.538243 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:48:12 crc kubenswrapper[4618]: E0121 09:48:12.539500 4618 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:48:22 crc kubenswrapper[4618]: I0121 09:48:22.989383 4618 generic.go:334] "Generic (PLEG): container finished" podID="17e85cf8-1423-4fd8-a5c0-367c58482277" containerID="cd559c537c1b227b46aba08c5e2ff87bf706cd47a6afe0bf574f888c04d1f6d3" exitCode=0 Jan 21 09:48:22 crc kubenswrapper[4618]: I0121 09:48:22.989477 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17e85cf8-1423-4fd8-a5c0-367c58482277","Type":"ContainerDied","Data":"cd559c537c1b227b46aba08c5e2ff87bf706cd47a6afe0bf574f888c04d1f6d3"} Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.282868 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382467 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config-secret\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382580 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ca-certs\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382617 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s449c\" (UniqueName: \"kubernetes.io/projected/17e85cf8-1423-4fd8-a5c0-367c58482277-kube-api-access-s449c\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382718 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-temporary\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382765 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-config-data\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382795 4618 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-workdir\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382827 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ssh-key\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382858 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.382876 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config\") pod \"17e85cf8-1423-4fd8-a5c0-367c58482277\" (UID: \"17e85cf8-1423-4fd8-a5c0-367c58482277\") " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.383764 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.384729 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-config-data" (OuterVolumeSpecName: "config-data") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.390523 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.390978 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.391060 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e85cf8-1423-4fd8-a5c0-367c58482277-kube-api-access-s449c" (OuterVolumeSpecName: "kube-api-access-s449c") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "kube-api-access-s449c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.406261 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.407382 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.408058 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.426417 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "17e85cf8-1423-4fd8-a5c0-367c58482277" (UID: "17e85cf8-1423-4fd8-a5c0-367c58482277"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487069 4618 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487120 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487137 4618 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/17e85cf8-1423-4fd8-a5c0-367c58482277-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487169 4618 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487215 4618 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487228 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487565 4618 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 
09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487581 4618 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/17e85cf8-1423-4fd8-a5c0-367c58482277-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.487593 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s449c\" (UniqueName: \"kubernetes.io/projected/17e85cf8-1423-4fd8-a5c0-367c58482277-kube-api-access-s449c\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.502889 4618 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 21 09:48:24 crc kubenswrapper[4618]: I0121 09:48:24.589127 4618 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 21 09:48:25 crc kubenswrapper[4618]: I0121 09:48:25.005738 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"17e85cf8-1423-4fd8-a5c0-367c58482277","Type":"ContainerDied","Data":"ad26215cc70cf0c71915a615235348ddb60ac8fa5d322c4f5efe29ad5ef7e03c"} Jan 21 09:48:25 crc kubenswrapper[4618]: I0121 09:48:25.005792 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad26215cc70cf0c71915a615235348ddb60ac8fa5d322c4f5efe29ad5ef7e03c" Jan 21 09:48:25 crc kubenswrapper[4618]: I0121 09:48:25.006077 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 21 09:48:27 crc kubenswrapper[4618]: I0121 09:48:27.537729 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:48:28 crc kubenswrapper[4618]: I0121 09:48:28.034773 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"d76c27caac0b2de1a12ca813f47411c3853858dac096a6b530057ccce98c3095"} Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.378414 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 21 09:48:33 crc kubenswrapper[4618]: E0121 09:48:33.379724 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="registry-server" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.379739 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="registry-server" Jan 21 09:48:33 crc kubenswrapper[4618]: E0121 09:48:33.379760 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="extract-content" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.379768 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="extract-content" Jan 21 09:48:33 crc kubenswrapper[4618]: E0121 09:48:33.379778 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="extract-utilities" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.379785 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="extract-utilities" Jan 21 09:48:33 crc 
kubenswrapper[4618]: E0121 09:48:33.379817 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e85cf8-1423-4fd8-a5c0-367c58482277" containerName="tempest-tests-tempest-tests-runner" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.379827 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e85cf8-1423-4fd8-a5c0-367c58482277" containerName="tempest-tests-tempest-tests-runner" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.380100 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e85cf8-1423-4fd8-a5c0-367c58482277" containerName="tempest-tests-tempest-tests-runner" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.380118 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9da60d-f23b-4104-ba5f-8d498377229c" containerName="registry-server" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.380968 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.384487 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wffvd" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.385822 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.492254 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgsv\" (UniqueName: \"kubernetes.io/projected/807b7b22-0fea-4aa1-bb39-fe47c6ed13c9-kube-api-access-8xgsv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.492305 4618 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.595107 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgsv\" (UniqueName: \"kubernetes.io/projected/807b7b22-0fea-4aa1-bb39-fe47c6ed13c9-kube-api-access-8xgsv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.595309 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.596405 4618 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.614564 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgsv\" (UniqueName: \"kubernetes.io/projected/807b7b22-0fea-4aa1-bb39-fe47c6ed13c9-kube-api-access-8xgsv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.617272 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:33 crc kubenswrapper[4618]: I0121 09:48:33.704197 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 21 09:48:34 crc kubenswrapper[4618]: I0121 09:48:34.101217 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 21 09:48:35 crc kubenswrapper[4618]: I0121 09:48:35.107671 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9","Type":"ContainerStarted","Data":"4e19674b3791841bc2d6076b53ee63360b096d2582f6449662571c0982752706"} Jan 21 09:48:36 crc kubenswrapper[4618]: I0121 09:48:36.115964 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"807b7b22-0fea-4aa1-bb39-fe47c6ed13c9","Type":"ContainerStarted","Data":"41a44c0ad0e9965d673eff9a945ad6c4b02cf5a28ca1deee5bb4ccaf7a6085e0"} Jan 21 09:48:36 crc kubenswrapper[4618]: I0121 09:48:36.134735 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.249761207 podStartE2EDuration="3.134717811s" podCreationTimestamp="2026-01-21 09:48:33 +0000 UTC" firstStartedPulling="2026-01-21 09:48:34.106919135 +0000 UTC m=+2712.857386452" 
lastFinishedPulling="2026-01-21 09:48:34.991875739 +0000 UTC m=+2713.742343056" observedRunningTime="2026-01-21 09:48:36.127320793 +0000 UTC m=+2714.877788109" watchObservedRunningTime="2026-01-21 09:48:36.134717811 +0000 UTC m=+2714.885185129" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.589310 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kd9nr/must-gather-w2c2x"] Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.591553 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.593678 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kd9nr"/"default-dockercfg-5qnhh" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.594736 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kd9nr"/"openshift-service-ca.crt" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.601653 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kd9nr"/"kube-root-ca.crt" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.603327 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kd9nr/must-gather-w2c2x"] Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.663598 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27pn\" (UniqueName: \"kubernetes.io/projected/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-kube-api-access-j27pn\") pod \"must-gather-w2c2x\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.663737 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-must-gather-output\") pod \"must-gather-w2c2x\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.765064 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j27pn\" (UniqueName: \"kubernetes.io/projected/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-kube-api-access-j27pn\") pod \"must-gather-w2c2x\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.765119 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-must-gather-output\") pod \"must-gather-w2c2x\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.765541 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-must-gather-output\") pod \"must-gather-w2c2x\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.785773 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j27pn\" (UniqueName: \"kubernetes.io/projected/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-kube-api-access-j27pn\") pod \"must-gather-w2c2x\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:55 crc kubenswrapper[4618]: I0121 09:48:55.910276 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:48:56 crc kubenswrapper[4618]: I0121 09:48:56.330249 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kd9nr/must-gather-w2c2x"] Jan 21 09:48:57 crc kubenswrapper[4618]: I0121 09:48:57.318328 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" event={"ID":"5bde442f-35fc-4320-a8b8-23f1f0b7a18b","Type":"ContainerStarted","Data":"e51b7310830f8bc239890c345ca5bd2639d9beb95f753a44a209cac74c0859dd"} Jan 21 09:49:03 crc kubenswrapper[4618]: I0121 09:49:03.378342 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" event={"ID":"5bde442f-35fc-4320-a8b8-23f1f0b7a18b","Type":"ContainerStarted","Data":"06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935"} Jan 21 09:49:03 crc kubenswrapper[4618]: I0121 09:49:03.379132 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" event={"ID":"5bde442f-35fc-4320-a8b8-23f1f0b7a18b","Type":"ContainerStarted","Data":"ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a"} Jan 21 09:49:03 crc kubenswrapper[4618]: I0121 09:49:03.397384 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" podStartSLOduration=2.042553598 podStartE2EDuration="8.39736004s" podCreationTimestamp="2026-01-21 09:48:55 +0000 UTC" firstStartedPulling="2026-01-21 09:48:56.331777611 +0000 UTC m=+2735.082244928" lastFinishedPulling="2026-01-21 09:49:02.686584053 +0000 UTC m=+2741.437051370" observedRunningTime="2026-01-21 09:49:03.390996302 +0000 UTC m=+2742.141463620" watchObservedRunningTime="2026-01-21 09:49:03.39736004 +0000 UTC m=+2742.147827356" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.322052 4618 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-kd9nr/crc-debug-2n9gs"] Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.323815 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.425633 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-host\") pod \"crc-debug-2n9gs\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") " pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.425929 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzfzl\" (UniqueName: \"kubernetes.io/projected/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-kube-api-access-rzfzl\") pod \"crc-debug-2n9gs\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") " pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.529213 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzfzl\" (UniqueName: \"kubernetes.io/projected/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-kube-api-access-rzfzl\") pod \"crc-debug-2n9gs\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") " pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.529375 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-host\") pod \"crc-debug-2n9gs\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") " pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.529550 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-host\") pod \"crc-debug-2n9gs\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") " pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.553457 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzfzl\" (UniqueName: \"kubernetes.io/projected/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-kube-api-access-rzfzl\") pod \"crc-debug-2n9gs\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") " pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: I0121 09:49:06.641235 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" Jan 21 09:49:06 crc kubenswrapper[4618]: W0121 09:49:06.690094 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode499af3b_220f_44c7_96d3_0d2cc6a92eb2.slice/crio-5db3ebb50dc2510877e9b1969f1d1b5e7d15dbda7a8fa6c218712dbede54c4f0 WatchSource:0}: Error finding container 5db3ebb50dc2510877e9b1969f1d1b5e7d15dbda7a8fa6c218712dbede54c4f0: Status 404 returned error can't find the container with id 5db3ebb50dc2510877e9b1969f1d1b5e7d15dbda7a8fa6c218712dbede54c4f0 Jan 21 09:49:07 crc kubenswrapper[4618]: I0121 09:49:07.411190 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" event={"ID":"e499af3b-220f-44c7-96d3-0d2cc6a92eb2","Type":"ContainerStarted","Data":"5db3ebb50dc2510877e9b1969f1d1b5e7d15dbda7a8fa6c218712dbede54c4f0"} Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.270372 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68d7cbc6d4-mthph_add10569-0b7d-47e6-a9fc-943ff2f54fc4/barbican-api-log/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.281736 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-68d7cbc6d4-mthph_add10569-0b7d-47e6-a9fc-943ff2f54fc4/barbican-api/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.312890 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c95ff478-nbsz8_4c1702d5-7295-4662-956a-180ac3b7c04d/barbican-keystone-listener-log/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.318109 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c95ff478-nbsz8_4c1702d5-7295-4662-956a-180ac3b7c04d/barbican-keystone-listener/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.329672 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df6b49cb5-9npwx_65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b/barbican-worker-log/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.334191 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df6b49cb5-9npwx_65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b/barbican-worker/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.368044 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn_194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.402295 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/ceilometer-central-agent/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.423837 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/ceilometer-notification-agent/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.428910 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/sg-core/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.437329 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/proxy-httpd/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.451297 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3c710262-7141-4edf-8f70-b5ee3d235970/cinder-api-log/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.472035 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3c710262-7141-4edf-8f70-b5ee3d235970/cinder-api/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.500104 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_def78b06-bd3c-4722-82a7-15b80abe36fe/cinder-scheduler/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.514482 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_def78b06-bd3c-4722-82a7-15b80abe36fe/probe/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.537425 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-44vvj_1b8522ab-9a18-468c-a001-27aa7228e059/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.557614 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9_3d61b58a-5231-47ee-8d01-2eb51a1def0c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.598504 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7b79b84c-h9xrw_9f529ba0-9024-4b63-8d19-bb798710ce6f/dnsmasq-dns/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 
09:49:08.603720 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7b79b84c-h9xrw_9f529ba0-9024-4b63-8d19-bb798710ce6f/init/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.630680 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2_9fab9896-c90d-47af-9a73-4cf53b19d631/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.642294 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f8cbddef-d1fd-490f-b499-3a9d2e570bce/glance-log/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.667591 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f8cbddef-d1fd-490f-b499-3a9d2e570bce/glance-httpd/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.680261 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c616426c-57a8-42a0-8dde-7ef7f56caf00/glance-log/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.694473 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c616426c-57a8-42a0-8dde-7ef7f56caf00/glance-httpd/0.log" Jan 21 09:49:08 crc kubenswrapper[4618]: I0121 09:49:08.932003 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-87c49d4f8-74x7z_d4a5a9b2-1432-43cc-bfe1-58285caf06ea/horizon-log/0.log" Jan 21 09:49:09 crc kubenswrapper[4618]: I0121 09:49:09.014416 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-87c49d4f8-74x7z_d4a5a9b2-1432-43cc-bfe1-58285caf06ea/horizon/0.log" Jan 21 09:49:09 crc kubenswrapper[4618]: I0121 09:49:09.035669 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-476wc_efaee91b-ca19-44cc-b8b4-37f6bf34067a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:09 crc kubenswrapper[4618]: I0121 09:49:09.060128 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4b7z2_3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:09 crc kubenswrapper[4618]: I0121 09:49:09.234801 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76759dfdcd-gxbvm_b62e287a-7db9-4d83-aae5-9cc273fff127/keystone-api/0.log" Jan 21 09:49:09 crc kubenswrapper[4618]: I0121 09:49:09.253617 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3/kube-state-metrics/0.log" Jan 21 09:49:09 crc kubenswrapper[4618]: I0121 09:49:09.290730 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr_5ff62cc0-5880-4589-ac86-a671f9533ff4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:18 crc kubenswrapper[4618]: I0121 09:49:18.507673 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" event={"ID":"e499af3b-220f-44c7-96d3-0d2cc6a92eb2","Type":"ContainerStarted","Data":"b5ac7006daa546b02652bb6176c1070251ac66ed0c6357a8d129173364f7a83e"} Jan 21 09:49:18 crc kubenswrapper[4618]: I0121 09:49:18.525737 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" podStartSLOduration=1.598114101 podStartE2EDuration="12.525719989s" podCreationTimestamp="2026-01-21 09:49:06 +0000 UTC" firstStartedPulling="2026-01-21 09:49:06.692171104 +0000 UTC m=+2745.442638421" lastFinishedPulling="2026-01-21 09:49:17.619776991 +0000 UTC m=+2756.370244309" observedRunningTime="2026-01-21 
09:49:18.525368878 +0000 UTC m=+2757.275836195" watchObservedRunningTime="2026-01-21 09:49:18.525719989 +0000 UTC m=+2757.276187306" Jan 21 09:49:23 crc kubenswrapper[4618]: I0121 09:49:23.551797 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f0b3230e-10e8-4707-8944-b59b1870a4fc/memcached/0.log" Jan 21 09:49:23 crc kubenswrapper[4618]: I0121 09:49:23.634810 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58985598b5-rf45g_dd14fbe1-01de-41f5-9247-d15844d8c697/neutron-api/0.log" Jan 21 09:49:23 crc kubenswrapper[4618]: I0121 09:49:23.685114 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58985598b5-rf45g_dd14fbe1-01de-41f5-9247-d15844d8c697/neutron-httpd/0.log" Jan 21 09:49:23 crc kubenswrapper[4618]: I0121 09:49:23.715151 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm_0bbeab64-b3ee-4412-a66b-c5871248bddb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:23 crc kubenswrapper[4618]: I0121 09:49:23.870618 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9edd45d0-acce-46fa-b1d2-29ddc021d690/nova-api-log/0.log" Jan 21 09:49:24 crc kubenswrapper[4618]: I0121 09:49:24.149298 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9edd45d0-acce-46fa-b1d2-29ddc021d690/nova-api-api/0.log" Jan 21 09:49:24 crc kubenswrapper[4618]: I0121 09:49:24.246222 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e717fabb-c7f8-4c12-a063-e9a5b0d2a671/nova-cell0-conductor-conductor/0.log" Jan 21 09:49:24 crc kubenswrapper[4618]: I0121 09:49:24.312530 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198/nova-cell1-conductor-conductor/0.log" Jan 21 09:49:24 crc kubenswrapper[4618]: I0121 
09:49:24.362360 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa/nova-cell1-novncproxy-novncproxy/0.log" Jan 21 09:49:24 crc kubenswrapper[4618]: I0121 09:49:24.419974 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-r4524_d826d9d4-6108-4f59-9c79-313f8f3b3d19/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:24 crc kubenswrapper[4618]: I0121 09:49:24.482987 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bcd77836-0c95-4165-8e69-9f1851be8f50/nova-metadata-log/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.109670 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bcd77836-0c95-4165-8e69-9f1851be8f50/nova-metadata-metadata/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.207977 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_17677a46-bed7-4316-91a1-e7d842f83d91/nova-scheduler-scheduler/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.226930 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_424be4c5-5cc7-4641-b497-f01556c3d8ea/galera/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.236860 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_424be4c5-5cc7-4641-b497-f01556c3d8ea/mysql-bootstrap/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.260040 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33a99731-3bad-4a35-97bc-2431645071bb/galera/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.269630 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33a99731-3bad-4a35-97bc-2431645071bb/mysql-bootstrap/0.log" Jan 21 09:49:25 crc 
kubenswrapper[4618]: I0121 09:49:25.275586 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6039b2d9-1ca5-480a-a1a4-f5ec50e082aa/openstackclient/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.283245 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-svclb_8a225614-1514-4820-8eff-8d760ef9a0b3/openstack-network-exporter/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.293235 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z8pqg_27ea43db-9444-46a2-aa4f-824245113798/ovsdb-server/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.301035 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z8pqg_27ea43db-9444-46a2-aa4f-824245113798/ovs-vswitchd/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.308891 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z8pqg_27ea43db-9444-46a2-aa4f-824245113798/ovsdb-server-init/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.318855 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-v4n6j_a889a44f-3ea7-4b43-b5ea-1f365a9611ac/ovn-controller/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.347110 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-25xdp_28d56297-035c-4b19-8135-4d63d60b9b62/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.353672 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00770641-2364-454a-9b73-663281ad8df0/ovn-northd/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.358711 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_00770641-2364-454a-9b73-663281ad8df0/openstack-network-exporter/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.369684 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc37a59b-ed3a-4007-b6bc-da3078536c98/ovsdbserver-nb/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.374848 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc37a59b-ed3a-4007-b6bc-da3078536c98/openstack-network-exporter/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.388943 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93d72e0b-9c67-4d3c-8eaf-b40cbf04df89/ovsdbserver-sb/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.392256 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93d72e0b-9c67-4d3c-8eaf-b40cbf04df89/openstack-network-exporter/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.443468 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54d488db9b-swfld_4634a027-fe25-4458-9f23-b984afd7a60f/placement-log/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.488923 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54d488db9b-swfld_4634a027-fe25-4458-9f23-b984afd7a60f/placement-api/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.617569 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6594f517-1fec-47c9-909d-674c8a7f36dd/rabbitmq/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.621639 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6594f517-1fec-47c9-909d-674c8a7f36dd/setup-container/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.640170 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_1a257ccd-7e16-4450-810b-14a2dca56eab/rabbitmq/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.645768 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a257ccd-7e16-4450-810b-14a2dca56eab/setup-container/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.658912 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn_182b5ccb-0f34-47f6-b087-ceed41764dc6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.671583 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qg6fs_1d012e11-c226-4c6f-b646-6358036a6924/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.680331 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz_ac42dc63-60fa-42fe-8497-f7164e407083/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.691439 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s4dgr_1ce37433-9d98-4388-8374-b3a26afdd1c3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.702419 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-bkf9g_aed2c63b-6043-45ca-90ac-b445dc0112fe/ssh-known-hosts-edpm-deployment/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.770875 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d5bd5664f-ncbh6_3fcbc9a4-5180-4530-8003-a54391ebbd6c/proxy-httpd/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.791809 4618 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-proxy-d5bd5664f-ncbh6_3fcbc9a4-5180-4530-8003-a54391ebbd6c/proxy-server/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.798943 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-l5bzv_a34a0fe1-3391-4b76-8274-d817bcca6d03/swift-ring-rebalance/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.839672 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-server/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.858189 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-replicator/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.861893 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-auditor/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.869039 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-reaper/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.875133 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-server/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.892099 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-replicator/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.896150 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-auditor/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.902428 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-updater/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.910854 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-server/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.924808 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-replicator/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.939505 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-auditor/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.946500 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-updater/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.955389 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-expirer/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.959105 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/rsync/0.log" Jan 21 09:49:25 crc kubenswrapper[4618]: I0121 09:49:25.965443 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/swift-recon-cron/0.log" Jan 21 09:49:26 crc kubenswrapper[4618]: I0121 09:49:26.044473 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn_4d8904ba-60fd-453f-884f-6fe7003c205f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:26 crc kubenswrapper[4618]: I0121 09:49:26.071616 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_17e85cf8-1423-4fd8-a5c0-367c58482277/tempest-tests-tempest-tests-runner/0.log" Jan 21 09:49:26 crc kubenswrapper[4618]: I0121 09:49:26.077818 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_807b7b22-0fea-4aa1-bb39-fe47c6ed13c9/test-operator-logs-container/0.log" Jan 21 09:49:26 crc kubenswrapper[4618]: I0121 09:49:26.100536 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh_05574b5d-bd37-4837-a247-9f1f5bb09d09/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.713452 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/extract/0.log" Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.721588 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/util/0.log" Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.727936 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/pull/0.log" Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.789280 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-6j9f2_982d4204-447a-43c3-858e-c16cceebf1bb/manager/0.log" Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.831946 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6zn64_d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4/manager/0.log" Jan 21 09:49:28 crc 
kubenswrapper[4618]: I0121 09:49:28.845400 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-nf54z_f3975776-d0c3-478c-873c-349415bf2d3c/manager/0.log"
Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.940646 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4r2qm_e0011800-e28a-4e71-8306-819d8d865dfe/manager/0.log"
Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.948665 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-ms7zc_276f144f-a185-46da-a3af-f0aa8a9eaaad/manager/0.log"
Jan 21 09:49:28 crc kubenswrapper[4618]: I0121 09:49:28.972432 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-bd65l_0ff11d9c-92c7-4b78-8336-70e117f63880/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.219222 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-dsjzx_cad4873a-5a2e-40ea-a4b1-3173e8138be0/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.254256 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-g58rl_80cee31f-467d-4c99-8b58-1edbee74f4a9/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.356764 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-l55q5_69396ad4-b4ad-4f43-a0f5-83b655e590da/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.432510 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-djc75_61c3771f-ea2c-4307-8d5b-7f44194235cd/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.457095 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-lsgpp_0ec13d1d-fae7-4efd-92d6-0b93f972694f/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.497796 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-5m9wn_f0bde946-f6c9-45a5-a124-6cf62551f0bc/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.573695 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-j5xjz_14908c8c-b444-4359-9e3a-e0fcc443e9f7/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.581917 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-cmhx4_1739988f-1de9-4c68-85ac-c14971105314/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.595603 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b9875986dvc9c5_b662a5ae-39f6-4592-baf2-efa15f7c82b0/manager/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.756756 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6d4d7d8545-hbl4s_049e7414-823b-45cc-92e6-da0652157046/operator/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.903618 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/controller/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.911410 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/kube-rbac-proxy/0.log"
Jan 21 09:49:29 crc kubenswrapper[4618]: I0121 09:49:29.938259 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/controller/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.385472 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75bfd788c8-42lr9_cfa3b66e-c251-46f7-ade1-edd4df56db67/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.449440 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6m77l_fa1a4914-7994-4004-b3aa-b3bbf62ed6df/registry-server/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.504711 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-7nkmc_b3629416-c45e-46da-98ba-dfd8b6630abd/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.535602 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-r895x_1f7120e5-8e39-4664-9d63-beaea1ff4043/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.564845 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9nmj5_1bab5bac-6dfb-48f0-bf21-71dbfb2d3653/operator/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.597037 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-zgrxl_5af2019b-e469-403f-8c3e-91006f2902ad/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.648826 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.659426 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/reloader/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.667553 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr-metrics/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.668081 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-778qv_16d3b481-106a-48ee-b99c-7a380086a9cd/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.672910 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.678634 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy-frr/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.679463 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-g4khd_e4f5bddf-5e04-4510-903b-6861f19fa87b/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.684674 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-frr-files/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.690504 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-czzg6_010792a0-26fd-456a-9186-79799c9a511e/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.692157 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-reloader/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.698268 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-metrics/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.707038 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2l8f6_0b1f4460-bb9d-4f03-a4bd-57e0a5f79669/frr-k8s-webhook-server/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.724499 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-656ff8bd-4klk8_4b0325f8-aa62-451f-84b7-9f393225ff9d/manager/0.log"
Jan 21 09:49:31 crc kubenswrapper[4618]: I0121 09:49:31.731658 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8485b999df-6fwkm_ecb8ccb1-678b-4dd5-be5e-8296b9305053/webhook-server/0.log"
Jan 21 09:49:32 crc kubenswrapper[4618]: I0121 09:49:32.016214 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/speaker/0.log"
Jan 21 09:49:32 crc kubenswrapper[4618]: I0121 09:49:32.024821 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/kube-rbac-proxy/0.log"
Jan 21 09:49:35 crc kubenswrapper[4618]: I0121 09:49:35.715233 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9t8g5_c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f/control-plane-machine-set-operator/0.log"
Jan 21 09:49:35 crc kubenswrapper[4618]: I0121 09:49:35.736288 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/kube-rbac-proxy/0.log"
Jan 21 09:49:35 crc kubenswrapper[4618]: I0121 09:49:35.744514 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/machine-api-operator/0.log"
Jan 21 09:49:46 crc kubenswrapper[4618]: I0121 09:49:46.749348 4618 generic.go:334] "Generic (PLEG): container finished" podID="e499af3b-220f-44c7-96d3-0d2cc6a92eb2" containerID="b5ac7006daa546b02652bb6176c1070251ac66ed0c6357a8d129173364f7a83e" exitCode=0
Jan 21 09:49:46 crc kubenswrapper[4618]: I0121 09:49:46.749414 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs" event={"ID":"e499af3b-220f-44c7-96d3-0d2cc6a92eb2","Type":"ContainerDied","Data":"b5ac7006daa546b02652bb6176c1070251ac66ed0c6357a8d129173364f7a83e"}
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.836493 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs"
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.863703 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-2n9gs"]
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.868841 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-2n9gs"]
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.872331 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzfzl\" (UniqueName: \"kubernetes.io/projected/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-kube-api-access-rzfzl\") pod \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") "
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.872408 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-host\") pod \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\" (UID: \"e499af3b-220f-44c7-96d3-0d2cc6a92eb2\") "
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.872622 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-host" (OuterVolumeSpecName: "host") pod "e499af3b-220f-44c7-96d3-0d2cc6a92eb2" (UID: "e499af3b-220f-44c7-96d3-0d2cc6a92eb2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.872926 4618 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-host\") on node \"crc\" DevicePath \"\""
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.877374 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-kube-api-access-rzfzl" (OuterVolumeSpecName: "kube-api-access-rzfzl") pod "e499af3b-220f-44c7-96d3-0d2cc6a92eb2" (UID: "e499af3b-220f-44c7-96d3-0d2cc6a92eb2"). InnerVolumeSpecName "kube-api-access-rzfzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:49:47 crc kubenswrapper[4618]: I0121 09:49:47.976028 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzfzl\" (UniqueName: \"kubernetes.io/projected/e499af3b-220f-44c7-96d3-0d2cc6a92eb2-kube-api-access-rzfzl\") on node \"crc\" DevicePath \"\""
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.771256 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db3ebb50dc2510877e9b1969f1d1b5e7d15dbda7a8fa6c218712dbede54c4f0"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.771302 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-2n9gs"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.990904 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-mqcq7"]
Jan 21 09:49:48 crc kubenswrapper[4618]: E0121 09:49:48.991397 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e499af3b-220f-44c7-96d3-0d2cc6a92eb2" containerName="container-00"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.991413 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e499af3b-220f-44c7-96d3-0d2cc6a92eb2" containerName="container-00"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.991639 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e499af3b-220f-44c7-96d3-0d2cc6a92eb2" containerName="container-00"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.992380 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.993497 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqs26\" (UniqueName: \"kubernetes.io/projected/e36efbb1-8437-41f9-afb3-128956b8e438-kube-api-access-xqs26\") pod \"crc-debug-mqcq7\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") " pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:48 crc kubenswrapper[4618]: I0121 09:49:48.993584 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e36efbb1-8437-41f9-afb3-128956b8e438-host\") pod \"crc-debug-mqcq7\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") " pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.096357 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e36efbb1-8437-41f9-afb3-128956b8e438-host\") pod \"crc-debug-mqcq7\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") " pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.096593 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e36efbb1-8437-41f9-afb3-128956b8e438-host\") pod \"crc-debug-mqcq7\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") " pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.096964 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqs26\" (UniqueName: \"kubernetes.io/projected/e36efbb1-8437-41f9-afb3-128956b8e438-kube-api-access-xqs26\") pod \"crc-debug-mqcq7\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") " pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.113512 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqs26\" (UniqueName: \"kubernetes.io/projected/e36efbb1-8437-41f9-afb3-128956b8e438-kube-api-access-xqs26\") pod \"crc-debug-mqcq7\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") " pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.307060 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.548882 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e499af3b-220f-44c7-96d3-0d2cc6a92eb2" path="/var/lib/kubelet/pods/e499af3b-220f-44c7-96d3-0d2cc6a92eb2/volumes"
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.782303 4618 generic.go:334] "Generic (PLEG): container finished" podID="e36efbb1-8437-41f9-afb3-128956b8e438" containerID="530067f9a3ab096cb7b17ac51d6872745506423ce6dc98b7f86842dd1f408e97" exitCode=0
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.782355 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-mqcq7" event={"ID":"e36efbb1-8437-41f9-afb3-128956b8e438","Type":"ContainerDied","Data":"530067f9a3ab096cb7b17ac51d6872745506423ce6dc98b7f86842dd1f408e97"}
Jan 21 09:49:49 crc kubenswrapper[4618]: I0121 09:49:49.782391 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-mqcq7" event={"ID":"e36efbb1-8437-41f9-afb3-128956b8e438","Type":"ContainerStarted","Data":"f365def606dfd5c0dc740b70168baa9124c55c27198ad51609eb4a24451ad812"}
Jan 21 09:49:50 crc kubenswrapper[4618]: I0121 09:49:50.195526 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-mqcq7"]
Jan 21 09:49:50 crc kubenswrapper[4618]: I0121 09:49:50.206309 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-mqcq7"]
Jan 21 09:49:50 crc kubenswrapper[4618]: I0121 09:49:50.870083 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.033675 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e36efbb1-8437-41f9-afb3-128956b8e438-host\") pod \"e36efbb1-8437-41f9-afb3-128956b8e438\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") "
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.033797 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e36efbb1-8437-41f9-afb3-128956b8e438-host" (OuterVolumeSpecName: "host") pod "e36efbb1-8437-41f9-afb3-128956b8e438" (UID: "e36efbb1-8437-41f9-afb3-128956b8e438"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.033878 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqs26\" (UniqueName: \"kubernetes.io/projected/e36efbb1-8437-41f9-afb3-128956b8e438-kube-api-access-xqs26\") pod \"e36efbb1-8437-41f9-afb3-128956b8e438\" (UID: \"e36efbb1-8437-41f9-afb3-128956b8e438\") "
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.034813 4618 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e36efbb1-8437-41f9-afb3-128956b8e438-host\") on node \"crc\" DevicePath \"\""
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.039284 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36efbb1-8437-41f9-afb3-128956b8e438-kube-api-access-xqs26" (OuterVolumeSpecName: "kube-api-access-xqs26") pod "e36efbb1-8437-41f9-afb3-128956b8e438" (UID: "e36efbb1-8437-41f9-afb3-128956b8e438"). InnerVolumeSpecName "kube-api-access-xqs26". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.137683 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqs26\" (UniqueName: \"kubernetes.io/projected/e36efbb1-8437-41f9-afb3-128956b8e438-kube-api-access-xqs26\") on node \"crc\" DevicePath \"\""
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.323459 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-6gjgx"]
Jan 21 09:49:51 crc kubenswrapper[4618]: E0121 09:49:51.323931 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36efbb1-8437-41f9-afb3-128956b8e438" containerName="container-00"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.323947 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36efbb1-8437-41f9-afb3-128956b8e438" containerName="container-00"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.324202 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36efbb1-8437-41f9-afb3-128956b8e438" containerName="container-00"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.325003 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.341412 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c4e3b31-d328-4569-8c9e-5e3de85b8540-host\") pod \"crc-debug-6gjgx\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") " pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.341486 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7mj\" (UniqueName: \"kubernetes.io/projected/9c4e3b31-d328-4569-8c9e-5e3de85b8540-kube-api-access-hh7mj\") pod \"crc-debug-6gjgx\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") " pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.443729 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c4e3b31-d328-4569-8c9e-5e3de85b8540-host\") pod \"crc-debug-6gjgx\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") " pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.443809 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7mj\" (UniqueName: \"kubernetes.io/projected/9c4e3b31-d328-4569-8c9e-5e3de85b8540-kube-api-access-hh7mj\") pod \"crc-debug-6gjgx\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") " pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.444104 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c4e3b31-d328-4569-8c9e-5e3de85b8540-host\") pod \"crc-debug-6gjgx\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") " pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.460135 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7mj\" (UniqueName: \"kubernetes.io/projected/9c4e3b31-d328-4569-8c9e-5e3de85b8540-kube-api-access-hh7mj\") pod \"crc-debug-6gjgx\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") " pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.549261 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36efbb1-8437-41f9-afb3-128956b8e438" path="/var/lib/kubelet/pods/e36efbb1-8437-41f9-afb3-128956b8e438/volumes"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.639501 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:51 crc kubenswrapper[4618]: W0121 09:49:51.659706 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c4e3b31_d328_4569_8c9e_5e3de85b8540.slice/crio-032ea45541c98c46187624cb71393ec60d8013e45ca640966588923d613a31ef WatchSource:0}: Error finding container 032ea45541c98c46187624cb71393ec60d8013e45ca640966588923d613a31ef: Status 404 returned error can't find the container with id 032ea45541c98c46187624cb71393ec60d8013e45ca640966588923d613a31ef
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.803305 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-mqcq7"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.803490 4618 scope.go:117] "RemoveContainer" containerID="530067f9a3ab096cb7b17ac51d6872745506423ce6dc98b7f86842dd1f408e97"
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.805096 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx" event={"ID":"9c4e3b31-d328-4569-8c9e-5e3de85b8540","Type":"ContainerStarted","Data":"4095975a997ca4b5f025b657890a508b192ec905d7c43423e0d648b788cefde1"}
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.805157 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx" event={"ID":"9c4e3b31-d328-4569-8c9e-5e3de85b8540","Type":"ContainerStarted","Data":"032ea45541c98c46187624cb71393ec60d8013e45ca640966588923d613a31ef"}
Jan 21 09:49:51 crc kubenswrapper[4618]: I0121 09:49:51.819413 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx" podStartSLOduration=0.819387234 podStartE2EDuration="819.387234ms" podCreationTimestamp="2026-01-21 09:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:49:51.814422568 +0000 UTC m=+2790.564889884" watchObservedRunningTime="2026-01-21 09:49:51.819387234 +0000 UTC m=+2790.569854551"
Jan 21 09:49:52 crc kubenswrapper[4618]: I0121 09:49:52.818828 4618 generic.go:334] "Generic (PLEG): container finished" podID="9c4e3b31-d328-4569-8c9e-5e3de85b8540" containerID="4095975a997ca4b5f025b657890a508b192ec905d7c43423e0d648b788cefde1" exitCode=0
Jan 21 09:49:52 crc kubenswrapper[4618]: I0121 09:49:52.818951 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx" event={"ID":"9c4e3b31-d328-4569-8c9e-5e3de85b8540","Type":"ContainerDied","Data":"4095975a997ca4b5f025b657890a508b192ec905d7c43423e0d648b788cefde1"}
Jan 21 09:49:53 crc kubenswrapper[4618]: I0121 09:49:53.899557 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:53 crc kubenswrapper[4618]: I0121 09:49:53.925005 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-6gjgx"]
Jan 21 09:49:53 crc kubenswrapper[4618]: I0121 09:49:53.931781 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kd9nr/crc-debug-6gjgx"]
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.100096 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh7mj\" (UniqueName: \"kubernetes.io/projected/9c4e3b31-d328-4569-8c9e-5e3de85b8540-kube-api-access-hh7mj\") pod \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") "
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.100287 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c4e3b31-d328-4569-8c9e-5e3de85b8540-host\") pod \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\" (UID: \"9c4e3b31-d328-4569-8c9e-5e3de85b8540\") "
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.100355 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c4e3b31-d328-4569-8c9e-5e3de85b8540-host" (OuterVolumeSpecName: "host") pod "9c4e3b31-d328-4569-8c9e-5e3de85b8540" (UID: "9c4e3b31-d328-4569-8c9e-5e3de85b8540"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.101497 4618 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c4e3b31-d328-4569-8c9e-5e3de85b8540-host\") on node \"crc\" DevicePath \"\""
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.109338 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4e3b31-d328-4569-8c9e-5e3de85b8540-kube-api-access-hh7mj" (OuterVolumeSpecName: "kube-api-access-hh7mj") pod "9c4e3b31-d328-4569-8c9e-5e3de85b8540" (UID: "9c4e3b31-d328-4569-8c9e-5e3de85b8540"). InnerVolumeSpecName "kube-api-access-hh7mj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.204084 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh7mj\" (UniqueName: \"kubernetes.io/projected/9c4e3b31-d328-4569-8c9e-5e3de85b8540-kube-api-access-hh7mj\") on node \"crc\" DevicePath \"\""
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.328737 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vv9rr_d9674a2f-8cdc-4165-b8e0-9cfc0914d17f/cert-manager-controller/0.log"
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.341502 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j6lvm_a23d36e0-6e5d-4cc6-a21c-9d6a114e7158/cert-manager-cainjector/0.log"
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.352569 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-q6frw_d736c899-0a94-4fb8-9e97-077345f1a8b7/cert-manager-webhook/0.log"
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.837709 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="032ea45541c98c46187624cb71393ec60d8013e45ca640966588923d613a31ef"
Jan 21 09:49:54 crc kubenswrapper[4618]: I0121 09:49:54.837751 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/crc-debug-6gjgx"
Jan 21 09:49:55 crc kubenswrapper[4618]: I0121 09:49:55.552414 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4e3b31-d328-4569-8c9e-5e3de85b8540" path="/var/lib/kubelet/pods/9c4e3b31-d328-4569-8c9e-5e3de85b8540/volumes"
Jan 21 09:49:58 crc kubenswrapper[4618]: I0121 09:49:58.783478 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-sqxmv_d7fc037d-6b85-473a-bd03-3a266430e4e2/nmstate-console-plugin/0.log"
Jan 21 09:49:58 crc kubenswrapper[4618]: I0121 09:49:58.809283 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fdzmd_a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2/nmstate-handler/0.log"
Jan 21 09:49:58 crc kubenswrapper[4618]: I0121 09:49:58.816357 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/nmstate-metrics/0.log"
Jan 21 09:49:58 crc kubenswrapper[4618]: I0121 09:49:58.823904 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/kube-rbac-proxy/0.log"
Jan 21 09:49:58 crc kubenswrapper[4618]: I0121 09:49:58.834278 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-dcjhc_80022532-8c85-41c8-8c65-a67f28411a13/nmstate-operator/0.log"
Jan 21 09:49:58 crc kubenswrapper[4618]: I0121 09:49:58.844991 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-lrckd_71e9ce01-3713-4cf6-a76e-ad21ac16e10e/nmstate-webhook/0.log"
Jan 21 09:50:08 crc kubenswrapper[4618]: I0121 09:50:08.067776 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/controller/0.log"
Jan 21 09:50:08 crc kubenswrapper[4618]: I0121 09:50:08.073745 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/kube-rbac-proxy/0.log"
Jan 21 09:50:08 crc kubenswrapper[4618]: I0121 09:50:08.090444 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/controller/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.176228 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.188892 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/reloader/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.203243 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr-metrics/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.210373 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.215818 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy-frr/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.221219 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-frr-files/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.226929 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-reloader/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.233221 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-metrics/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.243630 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2l8f6_0b1f4460-bb9d-4f03-a4bd-57e0a5f79669/frr-k8s-webhook-server/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.264756 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-656ff8bd-4klk8_4b0325f8-aa62-451f-84b7-9f393225ff9d/manager/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.278769 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8485b999df-6fwkm_ecb8ccb1-678b-4dd5-be5e-8296b9305053/webhook-server/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.570509 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/speaker/0.log"
Jan 21 09:50:09 crc kubenswrapper[4618]: I0121 09:50:09.579018 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/kube-rbac-proxy/0.log"
Jan 21 09:50:12 crc kubenswrapper[4618]: I0121 09:50:12.690592 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82_56473d23-b169-4791-a419-71d0ddf89139/extract/0.log"
Jan 21 09:50:12 crc kubenswrapper[4618]: I0121 09:50:12.697655 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82_56473d23-b169-4791-a419-71d0ddf89139/util/0.log"
Jan 21 09:50:12 crc kubenswrapper[4618]: I0121 09:50:12.706084 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82_56473d23-b169-4791-a419-71d0ddf89139/pull/0.log"
Jan 21 09:50:12 crc kubenswrapper[4618]: I0121 09:50:12.716898 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr_ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd/extract/0.log"
Jan 21 09:50:12 crc kubenswrapper[4618]: I0121 09:50:12.725411 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr_ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd/util/0.log"
Jan 21 09:50:12 crc kubenswrapper[4618]: I0121 09:50:12.731740 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr_ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd/pull/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.055736 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xcsv8_38d4879c-3ab9-4282-9d58-263cfb585759/registry-server/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.060335 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xcsv8_38d4879c-3ab9-4282-9d58-263cfb585759/extract-utilities/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.066574 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xcsv8_38d4879c-3ab9-4282-9d58-263cfb585759/extract-content/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.411175 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9svvz_4e29e499-2283-4105-bcf5-73ae74791ce6/registry-server/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.417742 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9svvz_4e29e499-2283-4105-bcf5-73ae74791ce6/extract-utilities/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.423827 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9svvz_4e29e499-2283-4105-bcf5-73ae74791ce6/extract-content/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.436417 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4mpc9_9eb45d53-b317-4346-9a4e-679ff4473d3d/marketplace-operator/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.529013 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69lfl_4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5/registry-server/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.533762 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69lfl_4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5/extract-utilities/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.541053 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69lfl_4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5/extract-content/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.916898 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xdxtq_bc646d50-9435-404e-9b80-42ad016be4f9/registry-server/0.log"
Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.921302 4618 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-marketplace_redhat-operators-xdxtq_bc646d50-9435-404e-9b80-42ad016be4f9/extract-utilities/0.log" Jan 21 09:50:13 crc kubenswrapper[4618]: I0121 09:50:13.928093 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xdxtq_bc646d50-9435-404e-9b80-42ad016be4f9/extract-content/0.log" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.803466 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-klj2f"] Jan 21 09:50:17 crc kubenswrapper[4618]: E0121 09:50:17.804314 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4e3b31-d328-4569-8c9e-5e3de85b8540" containerName="container-00" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.804328 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4e3b31-d328-4569-8c9e-5e3de85b8540" containerName="container-00" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.804522 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4e3b31-d328-4569-8c9e-5e3de85b8540" containerName="container-00" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.805684 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.818740 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klj2f"] Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.989659 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrpq\" (UniqueName: \"kubernetes.io/projected/7076a5c8-6032-4981-9471-098b76ccb64f-kube-api-access-8mrpq\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.990233 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-catalog-content\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:17 crc kubenswrapper[4618]: I0121 09:50:17.990330 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-utilities\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.093620 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrpq\" (UniqueName: \"kubernetes.io/projected/7076a5c8-6032-4981-9471-098b76ccb64f-kube-api-access-8mrpq\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.093728 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-catalog-content\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.093757 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-utilities\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.094216 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-utilities\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.094299 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-catalog-content\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.119773 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mrpq\" (UniqueName: \"kubernetes.io/projected/7076a5c8-6032-4981-9471-098b76ccb64f-kube-api-access-8mrpq\") pod \"redhat-marketplace-klj2f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.126892 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:18 crc kubenswrapper[4618]: I0121 09:50:18.573536 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klj2f"] Jan 21 09:50:19 crc kubenswrapper[4618]: I0121 09:50:19.040064 4618 generic.go:334] "Generic (PLEG): container finished" podID="7076a5c8-6032-4981-9471-098b76ccb64f" containerID="05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78" exitCode=0 Jan 21 09:50:19 crc kubenswrapper[4618]: I0121 09:50:19.040235 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerDied","Data":"05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78"} Jan 21 09:50:19 crc kubenswrapper[4618]: I0121 09:50:19.040460 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerStarted","Data":"a03b9359255991fb08a67e79ce234c1deb1d57f26f1c49b8800bf4d7cc698aa7"} Jan 21 09:50:19 crc kubenswrapper[4618]: I0121 09:50:19.042218 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:50:20 crc kubenswrapper[4618]: I0121 09:50:20.050713 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerStarted","Data":"4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c"} Jan 21 09:50:21 crc kubenswrapper[4618]: I0121 09:50:21.061777 4618 generic.go:334] "Generic (PLEG): container finished" podID="7076a5c8-6032-4981-9471-098b76ccb64f" containerID="4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c" exitCode=0 Jan 21 09:50:21 crc kubenswrapper[4618]: I0121 09:50:21.061878 4618 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerDied","Data":"4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c"} Jan 21 09:50:22 crc kubenswrapper[4618]: I0121 09:50:22.072283 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerStarted","Data":"419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15"} Jan 21 09:50:22 crc kubenswrapper[4618]: I0121 09:50:22.096082 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-klj2f" podStartSLOduration=2.456452561 podStartE2EDuration="5.096063629s" podCreationTimestamp="2026-01-21 09:50:17 +0000 UTC" firstStartedPulling="2026-01-21 09:50:19.041946576 +0000 UTC m=+2817.792413893" lastFinishedPulling="2026-01-21 09:50:21.681557643 +0000 UTC m=+2820.432024961" observedRunningTime="2026-01-21 09:50:22.091523993 +0000 UTC m=+2820.841991309" watchObservedRunningTime="2026-01-21 09:50:22.096063629 +0000 UTC m=+2820.846530946" Jan 21 09:50:28 crc kubenswrapper[4618]: I0121 09:50:28.127862 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:28 crc kubenswrapper[4618]: I0121 09:50:28.128236 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:28 crc kubenswrapper[4618]: I0121 09:50:28.179526 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:29 crc kubenswrapper[4618]: I0121 09:50:29.159856 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:29 crc kubenswrapper[4618]: I0121 09:50:29.199763 4618 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klj2f"] Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.824113 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c72zs"] Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.826105 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.840254 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c72zs"] Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.847051 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-catalog-content\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.847194 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-utilities\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.847342 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxlb\" (UniqueName: \"kubernetes.io/projected/9f935b69-751f-4269-aa5c-05c5ea73d857-kube-api-access-5jxlb\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.948588 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5jxlb\" (UniqueName: \"kubernetes.io/projected/9f935b69-751f-4269-aa5c-05c5ea73d857-kube-api-access-5jxlb\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.948654 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-catalog-content\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.948768 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-utilities\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.949253 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-utilities\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.949318 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-catalog-content\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:30 crc kubenswrapper[4618]: I0121 09:50:30.969266 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxlb\" (UniqueName: 
\"kubernetes.io/projected/9f935b69-751f-4269-aa5c-05c5ea73d857-kube-api-access-5jxlb\") pod \"redhat-operators-c72zs\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.137859 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-klj2f" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="registry-server" containerID="cri-o://419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15" gracePeriod=2 Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.141808 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.622203 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.663585 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c72zs"] Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.672848 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mrpq\" (UniqueName: \"kubernetes.io/projected/7076a5c8-6032-4981-9471-098b76ccb64f-kube-api-access-8mrpq\") pod \"7076a5c8-6032-4981-9471-098b76ccb64f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.673546 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-catalog-content\") pod \"7076a5c8-6032-4981-9471-098b76ccb64f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.673726 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-utilities\") pod \"7076a5c8-6032-4981-9471-098b76ccb64f\" (UID: \"7076a5c8-6032-4981-9471-098b76ccb64f\") " Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.675237 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-utilities" (OuterVolumeSpecName: "utilities") pod "7076a5c8-6032-4981-9471-098b76ccb64f" (UID: "7076a5c8-6032-4981-9471-098b76ccb64f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.680245 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7076a5c8-6032-4981-9471-098b76ccb64f-kube-api-access-8mrpq" (OuterVolumeSpecName: "kube-api-access-8mrpq") pod "7076a5c8-6032-4981-9471-098b76ccb64f" (UID: "7076a5c8-6032-4981-9471-098b76ccb64f"). InnerVolumeSpecName "kube-api-access-8mrpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.703109 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7076a5c8-6032-4981-9471-098b76ccb64f" (UID: "7076a5c8-6032-4981-9471-098b76ccb64f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.776936 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.776970 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7076a5c8-6032-4981-9471-098b76ccb64f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:50:31 crc kubenswrapper[4618]: I0121 09:50:31.776981 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mrpq\" (UniqueName: \"kubernetes.io/projected/7076a5c8-6032-4981-9471-098b76ccb64f-kube-api-access-8mrpq\") on node \"crc\" DevicePath \"\"" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.145705 4618 generic.go:334] "Generic (PLEG): container finished" podID="7076a5c8-6032-4981-9471-098b76ccb64f" containerID="419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15" exitCode=0 Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.145763 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klj2f" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.145791 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerDied","Data":"419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15"} Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.146066 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klj2f" event={"ID":"7076a5c8-6032-4981-9471-098b76ccb64f","Type":"ContainerDied","Data":"a03b9359255991fb08a67e79ce234c1deb1d57f26f1c49b8800bf4d7cc698aa7"} Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.146084 4618 scope.go:117] "RemoveContainer" containerID="419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.154609 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerID="c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf" exitCode=0 Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.154633 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerDied","Data":"c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf"} Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.154657 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerStarted","Data":"76ddccda16b4bf78867548192755e358e18facdeb51db180cf29bd4bba2e24cd"} Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.175684 4618 scope.go:117] "RemoveContainer" containerID="4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c" Jan 21 09:50:32 crc 
kubenswrapper[4618]: I0121 09:50:32.197771 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klj2f"] Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.203552 4618 scope.go:117] "RemoveContainer" containerID="05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.214958 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-klj2f"] Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.241031 4618 scope.go:117] "RemoveContainer" containerID="419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15" Jan 21 09:50:32 crc kubenswrapper[4618]: E0121 09:50:32.247353 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15\": container with ID starting with 419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15 not found: ID does not exist" containerID="419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.247410 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15"} err="failed to get container status \"419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15\": rpc error: code = NotFound desc = could not find container \"419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15\": container with ID starting with 419d24a38c8c6aa5208a35fdf8c35388e29404f5e1b52619c80cd2142d3ada15 not found: ID does not exist" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.247437 4618 scope.go:117] "RemoveContainer" containerID="4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c" Jan 21 09:50:32 crc kubenswrapper[4618]: E0121 09:50:32.247957 4618 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c\": container with ID starting with 4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c not found: ID does not exist" containerID="4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.247990 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c"} err="failed to get container status \"4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c\": rpc error: code = NotFound desc = could not find container \"4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c\": container with ID starting with 4dc8ac0903bbf6615b912a4c8991fd36ecfce3ad1e45eebf14ae93e06713035c not found: ID does not exist" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.248014 4618 scope.go:117] "RemoveContainer" containerID="05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78" Jan 21 09:50:32 crc kubenswrapper[4618]: E0121 09:50:32.248413 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78\": container with ID starting with 05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78 not found: ID does not exist" containerID="05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78" Jan 21 09:50:32 crc kubenswrapper[4618]: I0121 09:50:32.248441 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78"} err="failed to get container status \"05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78\": rpc error: code = NotFound desc = could 
not find container \"05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78\": container with ID starting with 05fd45c4449e5e5ce468f5ac9211078ffbbc995b386781981a745bd3937f4a78 not found: ID does not exist" Jan 21 09:50:33 crc kubenswrapper[4618]: I0121 09:50:33.165665 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerStarted","Data":"0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb"} Jan 21 09:50:33 crc kubenswrapper[4618]: I0121 09:50:33.547268 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" path="/var/lib/kubelet/pods/7076a5c8-6032-4981-9471-098b76ccb64f/volumes" Jan 21 09:50:34 crc kubenswrapper[4618]: I0121 09:50:34.176641 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerID="0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb" exitCode=0 Jan 21 09:50:34 crc kubenswrapper[4618]: I0121 09:50:34.176739 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerDied","Data":"0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb"} Jan 21 09:50:35 crc kubenswrapper[4618]: I0121 09:50:35.190398 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerStarted","Data":"afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1"} Jan 21 09:50:35 crc kubenswrapper[4618]: I0121 09:50:35.210098 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c72zs" podStartSLOduration=2.721824071 podStartE2EDuration="5.210082728s" podCreationTimestamp="2026-01-21 09:50:30 +0000 UTC" 
firstStartedPulling="2026-01-21 09:50:32.156514989 +0000 UTC m=+2830.906982305" lastFinishedPulling="2026-01-21 09:50:34.644773645 +0000 UTC m=+2833.395240962" observedRunningTime="2026-01-21 09:50:35.204547901 +0000 UTC m=+2833.955015219" watchObservedRunningTime="2026-01-21 09:50:35.210082728 +0000 UTC m=+2833.960550045" Jan 21 09:50:41 crc kubenswrapper[4618]: I0121 09:50:41.142564 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:41 crc kubenswrapper[4618]: I0121 09:50:41.142891 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:41 crc kubenswrapper[4618]: I0121 09:50:41.186292 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:41 crc kubenswrapper[4618]: I0121 09:50:41.272462 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:41 crc kubenswrapper[4618]: I0121 09:50:41.421203 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c72zs"] Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.243513 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c72zs" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="registry-server" containerID="cri-o://afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1" gracePeriod=2 Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.642695 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.731136 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-catalog-content\") pod \"9f935b69-751f-4269-aa5c-05c5ea73d857\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.731335 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-utilities\") pod \"9f935b69-751f-4269-aa5c-05c5ea73d857\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.731388 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jxlb\" (UniqueName: \"kubernetes.io/projected/9f935b69-751f-4269-aa5c-05c5ea73d857-kube-api-access-5jxlb\") pod \"9f935b69-751f-4269-aa5c-05c5ea73d857\" (UID: \"9f935b69-751f-4269-aa5c-05c5ea73d857\") " Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.731710 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-utilities" (OuterVolumeSpecName: "utilities") pod "9f935b69-751f-4269-aa5c-05c5ea73d857" (UID: "9f935b69-751f-4269-aa5c-05c5ea73d857"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.732281 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.738981 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f935b69-751f-4269-aa5c-05c5ea73d857-kube-api-access-5jxlb" (OuterVolumeSpecName: "kube-api-access-5jxlb") pod "9f935b69-751f-4269-aa5c-05c5ea73d857" (UID: "9f935b69-751f-4269-aa5c-05c5ea73d857"). InnerVolumeSpecName "kube-api-access-5jxlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.834671 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f935b69-751f-4269-aa5c-05c5ea73d857" (UID: "9f935b69-751f-4269-aa5c-05c5ea73d857"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.834803 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f935b69-751f-4269-aa5c-05c5ea73d857-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:50:43 crc kubenswrapper[4618]: I0121 09:50:43.834826 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jxlb\" (UniqueName: \"kubernetes.io/projected/9f935b69-751f-4269-aa5c-05c5ea73d857-kube-api-access-5jxlb\") on node \"crc\" DevicePath \"\"" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.251605 4618 generic.go:334] "Generic (PLEG): container finished" podID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerID="afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1" exitCode=0 Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.251649 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerDied","Data":"afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1"} Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.251659 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c72zs" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.251684 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c72zs" event={"ID":"9f935b69-751f-4269-aa5c-05c5ea73d857","Type":"ContainerDied","Data":"76ddccda16b4bf78867548192755e358e18facdeb51db180cf29bd4bba2e24cd"} Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.251703 4618 scope.go:117] "RemoveContainer" containerID="afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.274559 4618 scope.go:117] "RemoveContainer" containerID="0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.290995 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c72zs"] Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.297665 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c72zs"] Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.301910 4618 scope.go:117] "RemoveContainer" containerID="c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.327952 4618 scope.go:117] "RemoveContainer" containerID="afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1" Jan 21 09:50:44 crc kubenswrapper[4618]: E0121 09:50:44.328381 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1\": container with ID starting with afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1 not found: ID does not exist" containerID="afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.328412 4618 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1"} err="failed to get container status \"afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1\": rpc error: code = NotFound desc = could not find container \"afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1\": container with ID starting with afa225726e5d3d3e94e8c66f616b97e5009bd33f4c0ae94944ee5da3238d49c1 not found: ID does not exist" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.328437 4618 scope.go:117] "RemoveContainer" containerID="0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb" Jan 21 09:50:44 crc kubenswrapper[4618]: E0121 09:50:44.328716 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb\": container with ID starting with 0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb not found: ID does not exist" containerID="0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.328736 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb"} err="failed to get container status \"0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb\": rpc error: code = NotFound desc = could not find container \"0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb\": container with ID starting with 0a0545da08eab403f6a8eb3b84752ed6d6806e38b753cbca8f932674644198bb not found: ID does not exist" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.328748 4618 scope.go:117] "RemoveContainer" containerID="c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf" Jan 21 09:50:44 crc kubenswrapper[4618]: E0121 
09:50:44.328957 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf\": container with ID starting with c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf not found: ID does not exist" containerID="c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf" Jan 21 09:50:44 crc kubenswrapper[4618]: I0121 09:50:44.328980 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf"} err="failed to get container status \"c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf\": rpc error: code = NotFound desc = could not find container \"c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf\": container with ID starting with c5eb9970100d02dd902fe3b2cbc44f6ce021058527710d66b370636dba03e1bf not found: ID does not exist" Jan 21 09:50:45 crc kubenswrapper[4618]: I0121 09:50:45.547400 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" path="/var/lib/kubelet/pods/9f935b69-751f-4269-aa5c-05c5ea73d857/volumes" Jan 21 09:50:56 crc kubenswrapper[4618]: I0121 09:50:56.958798 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:50:56 crc kubenswrapper[4618]: I0121 09:50:56.959497 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 21 09:51:10 crc kubenswrapper[4618]: I0121 09:51:10.269219 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/controller/0.log" Jan 21 09:51:10 crc kubenswrapper[4618]: I0121 09:51:10.274725 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/kube-rbac-proxy/0.log" Jan 21 09:51:10 crc kubenswrapper[4618]: I0121 09:51:10.294972 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/controller/0.log" Jan 21 09:51:10 crc kubenswrapper[4618]: I0121 09:51:10.402587 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vv9rr_d9674a2f-8cdc-4165-b8e0-9cfc0914d17f/cert-manager-controller/0.log" Jan 21 09:51:10 crc kubenswrapper[4618]: I0121 09:51:10.417428 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j6lvm_a23d36e0-6e5d-4cc6-a21c-9d6a114e7158/cert-manager-cainjector/0.log" Jan 21 09:51:10 crc kubenswrapper[4618]: I0121 09:51:10.428743 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-q6frw_d736c899-0a94-4fb8-9e97-077345f1a8b7/cert-manager-webhook/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.202747 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/extract/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.209187 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/util/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.215137 4618 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/pull/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.316381 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-6j9f2_982d4204-447a-43c3-858e-c16cceebf1bb/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.363873 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6zn64_d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.383002 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-nf54z_f3975776-d0c3-478c-873c-349415bf2d3c/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.456949 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.473370 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/reloader/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.482871 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr-metrics/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.491521 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.494084 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4r2qm_e0011800-e28a-4e71-8306-819d8d865dfe/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.499950 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy-frr/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.500895 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-ms7zc_276f144f-a185-46da-a3af-f0aa8a9eaaad/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.505387 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-frr-files/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.511958 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-reloader/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.519799 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-metrics/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.522751 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-bd65l_0ff11d9c-92c7-4b78-8336-70e117f63880/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.526378 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2l8f6_0b1f4460-bb9d-4f03-a4bd-57e0a5f79669/frr-k8s-webhook-server/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.565315 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-656ff8bd-4klk8_4b0325f8-aa62-451f-84b7-9f393225ff9d/manager/0.log" Jan 
21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.573489 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8485b999df-6fwkm_ecb8ccb1-678b-4dd5-be5e-8296b9305053/webhook-server/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.913420 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-dsjzx_cad4873a-5a2e-40ea-a4b1-3173e8138be0/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.927720 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-g58rl_80cee31f-467d-4c99-8b58-1edbee74f4a9/manager/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.971814 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/speaker/0.log" Jan 21 09:51:11 crc kubenswrapper[4618]: I0121 09:51:11.980363 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/kube-rbac-proxy/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.001436 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-l55q5_69396ad4-b4ad-4f43-a0f5-83b655e590da/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.009841 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-djc75_61c3771f-ea2c-4307-8d5b-7f44194235cd/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.036716 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-lsgpp_0ec13d1d-fae7-4efd-92d6-0b93f972694f/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.090530 4618 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-5m9wn_f0bde946-f6c9-45a5-a124-6cf62551f0bc/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.158024 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-j5xjz_14908c8c-b444-4359-9e3a-e0fcc443e9f7/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.166766 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-cmhx4_1739988f-1de9-4c68-85ac-c14971105314/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.180024 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b9875986dvc9c5_b662a5ae-39f6-4592-baf2-efa15f7c82b0/manager/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.338583 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6d4d7d8545-hbl4s_049e7414-823b-45cc-92e6-da0652157046/operator/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.829317 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vv9rr_d9674a2f-8cdc-4165-b8e0-9cfc0914d17f/cert-manager-controller/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.855684 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j6lvm_a23d36e0-6e5d-4cc6-a21c-9d6a114e7158/cert-manager-cainjector/0.log" Jan 21 09:51:12 crc kubenswrapper[4618]: I0121 09:51:12.865664 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-q6frw_d736c899-0a94-4fb8-9e97-077345f1a8b7/cert-manager-webhook/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.451643 4618 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75bfd788c8-42lr9_cfa3b66e-c251-46f7-ade1-edd4df56db67/manager/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.452258 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9t8g5_c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f/control-plane-machine-set-operator/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.462136 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/kube-rbac-proxy/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.470918 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/machine-api-operator/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.494536 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6m77l_fa1a4914-7994-4004-b3aa-b3bbf62ed6df/registry-server/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.549060 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-7nkmc_b3629416-c45e-46da-98ba-dfd8b6630abd/manager/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.575794 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-r895x_1f7120e5-8e39-4664-9d63-beaea1ff4043/manager/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.595871 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9nmj5_1bab5bac-6dfb-48f0-bf21-71dbfb2d3653/operator/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: 
I0121 09:51:13.622180 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-zgrxl_5af2019b-e469-403f-8c3e-91006f2902ad/manager/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.682671 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-778qv_16d3b481-106a-48ee-b99c-7a380086a9cd/manager/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.694077 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-g4khd_e4f5bddf-5e04-4510-903b-6861f19fa87b/manager/0.log" Jan 21 09:51:13 crc kubenswrapper[4618]: I0121 09:51:13.701740 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-czzg6_010792a0-26fd-456a-9186-79799c9a511e/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.079288 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/extract/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.086027 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/util/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.095848 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/pull/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.164745 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-6j9f2_982d4204-447a-43c3-858e-c16cceebf1bb/manager/0.log" 
Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.200766 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6zn64_d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.209791 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-nf54z_f3975776-d0c3-478c-873c-349415bf2d3c/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.285895 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4r2qm_e0011800-e28a-4e71-8306-819d8d865dfe/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.294925 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-ms7zc_276f144f-a185-46da-a3af-f0aa8a9eaaad/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.319686 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-bd65l_0ff11d9c-92c7-4b78-8336-70e117f63880/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.571267 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-dsjzx_cad4873a-5a2e-40ea-a4b1-3173e8138be0/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.581812 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-g58rl_80cee31f-467d-4c99-8b58-1edbee74f4a9/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.650339 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-l55q5_69396ad4-b4ad-4f43-a0f5-83b655e590da/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.662082 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-djc75_61c3771f-ea2c-4307-8d5b-7f44194235cd/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.686787 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-lsgpp_0ec13d1d-fae7-4efd-92d6-0b93f972694f/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.734186 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-5m9wn_f0bde946-f6c9-45a5-a124-6cf62551f0bc/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.812566 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-j5xjz_14908c8c-b444-4359-9e3a-e0fcc443e9f7/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.822406 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-cmhx4_1739988f-1de9-4c68-85ac-c14971105314/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.837498 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b9875986dvc9c5_b662a5ae-39f6-4592-baf2-efa15f7c82b0/manager/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.860073 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-sqxmv_d7fc037d-6b85-473a-bd03-3a266430e4e2/nmstate-console-plugin/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.871812 4618 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fdzmd_a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2/nmstate-handler/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.882488 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/nmstate-metrics/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.889859 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/kube-rbac-proxy/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.904803 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-dcjhc_80022532-8c85-41c8-8c65-a67f28411a13/nmstate-operator/0.log" Jan 21 09:51:14 crc kubenswrapper[4618]: I0121 09:51:14.914082 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-lrckd_71e9ce01-3713-4cf6-a76e-ad21ac16e10e/nmstate-webhook/0.log" Jan 21 09:51:15 crc kubenswrapper[4618]: I0121 09:51:15.000652 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6d4d7d8545-hbl4s_049e7414-823b-45cc-92e6-da0652157046/operator/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.079766 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75bfd788c8-42lr9_cfa3b66e-c251-46f7-ade1-edd4df56db67/manager/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.118188 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6m77l_fa1a4914-7994-4004-b3aa-b3bbf62ed6df/registry-server/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.158674 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-7nkmc_b3629416-c45e-46da-98ba-dfd8b6630abd/manager/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.182464 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-r895x_1f7120e5-8e39-4664-9d63-beaea1ff4043/manager/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.201655 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9nmj5_1bab5bac-6dfb-48f0-bf21-71dbfb2d3653/operator/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.228172 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-zgrxl_5af2019b-e469-403f-8c3e-91006f2902ad/manager/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.281500 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-778qv_16d3b481-106a-48ee-b99c-7a380086a9cd/manager/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.291170 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-g4khd_e4f5bddf-5e04-4510-903b-6861f19fa87b/manager/0.log" Jan 21 09:51:16 crc kubenswrapper[4618]: I0121 09:51:16.301012 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-czzg6_010792a0-26fd-456a-9186-79799c9a511e/manager/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.590302 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/kube-multus-additional-cni-plugins/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.597840 4618 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/egress-router-binary-copy/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.603979 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/cni-plugins/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.611026 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/bond-cni-plugin/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.616618 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/routeoverride-cni/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.623986 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/whereabouts-cni-bincopy/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.630382 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/whereabouts-cni/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.663341 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-dp9f8_fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b/multus-admission-controller/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.668051 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-dp9f8_fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b/kube-rbac-proxy/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.726230 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/2.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.787294 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/3.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.816893 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpxzc_d164c95c-cb58-47e7-a3a3-7e7bce8b9743/network-metrics-daemon/0.log" Jan 21 09:51:17 crc kubenswrapper[4618]: I0121 09:51:17.821639 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpxzc_d164c95c-cb58-47e7-a3a3-7e7bce8b9743/kube-rbac-proxy/0.log" Jan 21 09:51:26 crc kubenswrapper[4618]: I0121 09:51:26.958599 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:51:26 crc kubenswrapper[4618]: I0121 09:51:26.959167 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.136101 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w4mnh"] Jan 21 09:51:40 crc kubenswrapper[4618]: E0121 09:51:40.137157 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="extract-utilities" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137173 4618 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="extract-utilities" Jan 21 09:51:40 crc kubenswrapper[4618]: E0121 09:51:40.137186 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="extract-utilities" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137192 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="extract-utilities" Jan 21 09:51:40 crc kubenswrapper[4618]: E0121 09:51:40.137209 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="extract-content" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137215 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="extract-content" Jan 21 09:51:40 crc kubenswrapper[4618]: E0121 09:51:40.137224 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="registry-server" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137229 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="registry-server" Jan 21 09:51:40 crc kubenswrapper[4618]: E0121 09:51:40.137256 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="extract-content" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137262 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="extract-content" Jan 21 09:51:40 crc kubenswrapper[4618]: E0121 09:51:40.137270 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="registry-server" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137276 4618 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="registry-server" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137474 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="7076a5c8-6032-4981-9471-098b76ccb64f" containerName="registry-server" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.137499 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f935b69-751f-4269-aa5c-05c5ea73d857" containerName="registry-server" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.138844 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.158313 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w4mnh"] Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.225636 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-utilities\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.225788 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-catalog-content\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.225882 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xn9m\" (UniqueName: 
\"kubernetes.io/projected/9b04b0fb-0477-4d28-bf62-a30a213b5802-kube-api-access-2xn9m\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.329210 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xn9m\" (UniqueName: \"kubernetes.io/projected/9b04b0fb-0477-4d28-bf62-a30a213b5802-kube-api-access-2xn9m\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.329399 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-utilities\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.329439 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-catalog-content\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.330004 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-catalog-content\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.330261 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-utilities\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.351558 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xn9m\" (UniqueName: \"kubernetes.io/projected/9b04b0fb-0477-4d28-bf62-a30a213b5802-kube-api-access-2xn9m\") pod \"community-operators-w4mnh\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.464926 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:40 crc kubenswrapper[4618]: I0121 09:51:40.949466 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w4mnh"] Jan 21 09:51:41 crc kubenswrapper[4618]: I0121 09:51:41.687126 4618 generic.go:334] "Generic (PLEG): container finished" podID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerID="3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9" exitCode=0 Jan 21 09:51:41 crc kubenswrapper[4618]: I0121 09:51:41.687474 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerDied","Data":"3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9"} Jan 21 09:51:41 crc kubenswrapper[4618]: I0121 09:51:41.687503 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerStarted","Data":"3f6dc5b2909c74f55af47d9fbfda196f8b1b3a57bbb1ed23c8d49f4a620ebf89"} Jan 21 09:51:42 crc kubenswrapper[4618]: I0121 09:51:42.700425 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerStarted","Data":"15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f"} Jan 21 09:51:43 crc kubenswrapper[4618]: I0121 09:51:43.713603 4618 generic.go:334] "Generic (PLEG): container finished" podID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerID="15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f" exitCode=0 Jan 21 09:51:43 crc kubenswrapper[4618]: I0121 09:51:43.713681 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerDied","Data":"15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f"} Jan 21 09:51:44 crc kubenswrapper[4618]: I0121 09:51:44.725118 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerStarted","Data":"4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3"} Jan 21 09:51:44 crc kubenswrapper[4618]: I0121 09:51:44.747394 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w4mnh" podStartSLOduration=2.232310232 podStartE2EDuration="4.747372102s" podCreationTimestamp="2026-01-21 09:51:40 +0000 UTC" firstStartedPulling="2026-01-21 09:51:41.68909714 +0000 UTC m=+2900.439564457" lastFinishedPulling="2026-01-21 09:51:44.204159009 +0000 UTC m=+2902.954626327" observedRunningTime="2026-01-21 09:51:44.738697039 +0000 UTC m=+2903.489164356" watchObservedRunningTime="2026-01-21 09:51:44.747372102 +0000 UTC m=+2903.497839420" Jan 21 09:51:50 crc kubenswrapper[4618]: I0121 09:51:50.465103 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:50 crc kubenswrapper[4618]: I0121 09:51:50.466220 
4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:50 crc kubenswrapper[4618]: I0121 09:51:50.501580 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:50 crc kubenswrapper[4618]: I0121 09:51:50.804698 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:50 crc kubenswrapper[4618]: I0121 09:51:50.845238 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w4mnh"] Jan 21 09:51:52 crc kubenswrapper[4618]: I0121 09:51:52.784326 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w4mnh" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="registry-server" containerID="cri-o://4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3" gracePeriod=2 Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.652430 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.712445 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xn9m\" (UniqueName: \"kubernetes.io/projected/9b04b0fb-0477-4d28-bf62-a30a213b5802-kube-api-access-2xn9m\") pod \"9b04b0fb-0477-4d28-bf62-a30a213b5802\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.712531 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-catalog-content\") pod \"9b04b0fb-0477-4d28-bf62-a30a213b5802\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.712666 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-utilities\") pod \"9b04b0fb-0477-4d28-bf62-a30a213b5802\" (UID: \"9b04b0fb-0477-4d28-bf62-a30a213b5802\") " Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.713586 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-utilities" (OuterVolumeSpecName: "utilities") pod "9b04b0fb-0477-4d28-bf62-a30a213b5802" (UID: "9b04b0fb-0477-4d28-bf62-a30a213b5802"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.714742 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.718629 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b04b0fb-0477-4d28-bf62-a30a213b5802-kube-api-access-2xn9m" (OuterVolumeSpecName: "kube-api-access-2xn9m") pod "9b04b0fb-0477-4d28-bf62-a30a213b5802" (UID: "9b04b0fb-0477-4d28-bf62-a30a213b5802"). InnerVolumeSpecName "kube-api-access-2xn9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.752891 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b04b0fb-0477-4d28-bf62-a30a213b5802" (UID: "9b04b0fb-0477-4d28-bf62-a30a213b5802"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.795985 4618 generic.go:334] "Generic (PLEG): container finished" podID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerID="4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3" exitCode=0 Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.796037 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerDied","Data":"4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3"} Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.796075 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4mnh" event={"ID":"9b04b0fb-0477-4d28-bf62-a30a213b5802","Type":"ContainerDied","Data":"3f6dc5b2909c74f55af47d9fbfda196f8b1b3a57bbb1ed23c8d49f4a620ebf89"} Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.796092 4618 scope.go:117] "RemoveContainer" containerID="4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.796233 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4mnh" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.825664 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xn9m\" (UniqueName: \"kubernetes.io/projected/9b04b0fb-0477-4d28-bf62-a30a213b5802-kube-api-access-2xn9m\") on node \"crc\" DevicePath \"\"" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.825713 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b04b0fb-0477-4d28-bf62-a30a213b5802-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.830887 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w4mnh"] Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.831979 4618 scope.go:117] "RemoveContainer" containerID="15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.838594 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w4mnh"] Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.849825 4618 scope.go:117] "RemoveContainer" containerID="3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.882306 4618 scope.go:117] "RemoveContainer" containerID="4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3" Jan 21 09:51:53 crc kubenswrapper[4618]: E0121 09:51:53.882637 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3\": container with ID starting with 4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3 not found: ID does not exist" containerID="4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3" Jan 21 09:51:53 crc 
kubenswrapper[4618]: I0121 09:51:53.882681 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3"} err="failed to get container status \"4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3\": rpc error: code = NotFound desc = could not find container \"4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3\": container with ID starting with 4f3de037fa30a5b40ab4d3c3d8a4a73c9f09ba9adffb146df32b5870033eace3 not found: ID does not exist" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.882712 4618 scope.go:117] "RemoveContainer" containerID="15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f" Jan 21 09:51:53 crc kubenswrapper[4618]: E0121 09:51:53.883011 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f\": container with ID starting with 15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f not found: ID does not exist" containerID="15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.883054 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f"} err="failed to get container status \"15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f\": rpc error: code = NotFound desc = could not find container \"15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f\": container with ID starting with 15b28a0b4affad8831a1db907a42d7e5f8f157ba730ddb846754621f0523670f not found: ID does not exist" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.883081 4618 scope.go:117] "RemoveContainer" containerID="3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9" Jan 21 
09:51:53 crc kubenswrapper[4618]: E0121 09:51:53.883394 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9\": container with ID starting with 3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9 not found: ID does not exist" containerID="3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9" Jan 21 09:51:53 crc kubenswrapper[4618]: I0121 09:51:53.883428 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9"} err="failed to get container status \"3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9\": rpc error: code = NotFound desc = could not find container \"3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9\": container with ID starting with 3137448beb30721ef737f14473072f03ecd2a3e45446ea0b8be36f029e48aee9 not found: ID does not exist" Jan 21 09:51:55 crc kubenswrapper[4618]: I0121 09:51:55.550604 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" path="/var/lib/kubelet/pods/9b04b0fb-0477-4d28-bf62-a30a213b5802/volumes" Jan 21 09:51:56 crc kubenswrapper[4618]: I0121 09:51:56.959710 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:51:56 crc kubenswrapper[4618]: I0121 09:51:56.959769 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:51:56 crc kubenswrapper[4618]: I0121 09:51:56.959814 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:51:56 crc kubenswrapper[4618]: I0121 09:51:56.960323 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d76c27caac0b2de1a12ca813f47411c3853858dac096a6b530057ccce98c3095"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:51:56 crc kubenswrapper[4618]: I0121 09:51:56.960368 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://d76c27caac0b2de1a12ca813f47411c3853858dac096a6b530057ccce98c3095" gracePeriod=600 Jan 21 09:51:57 crc kubenswrapper[4618]: I0121 09:51:57.824920 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="d76c27caac0b2de1a12ca813f47411c3853858dac096a6b530057ccce98c3095" exitCode=0 Jan 21 09:51:57 crc kubenswrapper[4618]: I0121 09:51:57.824986 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"d76c27caac0b2de1a12ca813f47411c3853858dac096a6b530057ccce98c3095"} Jan 21 09:51:57 crc kubenswrapper[4618]: I0121 09:51:57.825327 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" 
event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251"} Jan 21 09:51:57 crc kubenswrapper[4618]: I0121 09:51:57.825348 4618 scope.go:117] "RemoveContainer" containerID="243ada45141680833fd171d16dfd8b069dcc3f176c8b4c541ef3ea78b472c05e" Jan 21 09:54:26 crc kubenswrapper[4618]: I0121 09:54:26.959340 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:54:26 crc kubenswrapper[4618]: I0121 09:54:26.960218 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:54:56 crc kubenswrapper[4618]: I0121 09:54:56.958647 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:54:56 crc kubenswrapper[4618]: I0121 09:54:56.959371 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:55:26 crc kubenswrapper[4618]: I0121 09:55:26.959505 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 09:55:26 crc kubenswrapper[4618]: I0121 09:55:26.960128 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 09:55:26 crc kubenswrapper[4618]: I0121 09:55:26.960188 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 09:55:26 crc kubenswrapper[4618]: I0121 09:55:26.960590 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 09:55:26 crc kubenswrapper[4618]: I0121 09:55:26.960632 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" gracePeriod=600 Jan 21 09:55:27 crc kubenswrapper[4618]: E0121 09:55:27.076557 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:55:27 crc kubenswrapper[4618]: I0121 09:55:27.580182 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" exitCode=0 Jan 21 09:55:27 crc kubenswrapper[4618]: I0121 09:55:27.580431 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251"} Jan 21 09:55:27 crc kubenswrapper[4618]: I0121 09:55:27.580458 4618 scope.go:117] "RemoveContainer" containerID="d76c27caac0b2de1a12ca813f47411c3853858dac096a6b530057ccce98c3095" Jan 21 09:55:27 crc kubenswrapper[4618]: I0121 09:55:27.580823 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:55:27 crc kubenswrapper[4618]: E0121 09:55:27.581020 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:55:29 crc kubenswrapper[4618]: I0121 09:55:29.554887 4618 scope.go:117] "RemoveContainer" containerID="b5ac7006daa546b02652bb6176c1070251ac66ed0c6357a8d129173364f7a83e" Jan 21 09:55:42 crc kubenswrapper[4618]: I0121 09:55:42.539135 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:55:42 crc kubenswrapper[4618]: E0121 09:55:42.540385 4618 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:55:53 crc kubenswrapper[4618]: I0121 09:55:53.538784 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:55:53 crc kubenswrapper[4618]: E0121 09:55:53.539958 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:56:06 crc kubenswrapper[4618]: I0121 09:56:06.538580 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:56:06 crc kubenswrapper[4618]: E0121 09:56:06.539463 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:56:19 crc kubenswrapper[4618]: I0121 09:56:19.538164 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:56:19 crc kubenswrapper[4618]: E0121 
09:56:19.539254 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:56:29 crc kubenswrapper[4618]: I0121 09:56:29.615612 4618 scope.go:117] "RemoveContainer" containerID="4095975a997ca4b5f025b657890a508b192ec905d7c43423e0d648b788cefde1" Jan 21 09:56:30 crc kubenswrapper[4618]: I0121 09:56:30.538078 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:56:30 crc kubenswrapper[4618]: E0121 09:56:30.538446 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:56:41 crc kubenswrapper[4618]: I0121 09:56:41.543099 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:56:41 crc kubenswrapper[4618]: E0121 09:56:41.544072 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:56:52 crc 
kubenswrapper[4618]: I0121 09:56:52.538790 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:56:52 crc kubenswrapper[4618]: E0121 09:56:52.539259 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:57:06 crc kubenswrapper[4618]: I0121 09:57:06.539398 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:57:06 crc kubenswrapper[4618]: E0121 09:57:06.540859 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:57:17 crc kubenswrapper[4618]: I0121 09:57:17.538470 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:57:17 crc kubenswrapper[4618]: E0121 09:57:17.539275 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 
21 09:57:30 crc kubenswrapper[4618]: I0121 09:57:30.537605 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:57:30 crc kubenswrapper[4618]: E0121 09:57:30.538407 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.820976 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whwxq"] Jan 21 09:57:32 crc kubenswrapper[4618]: E0121 09:57:32.828202 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="extract-content" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.828223 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="extract-content" Jan 21 09:57:32 crc kubenswrapper[4618]: E0121 09:57:32.828265 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="extract-utilities" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.828272 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="extract-utilities" Jan 21 09:57:32 crc kubenswrapper[4618]: E0121 09:57:32.828287 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="registry-server" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.828293 4618 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="registry-server" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.828651 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b04b0fb-0477-4d28-bf62-a30a213b5802" containerName="registry-server" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.835452 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.842757 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whwxq"] Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.991104 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-catalog-content\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.991263 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-utilities\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:32 crc kubenswrapper[4618]: I0121 09:57:32.991315 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwgmp\" (UniqueName: \"kubernetes.io/projected/3b865325-c1a1-44f9-ac1f-2b414e52d499-kube-api-access-nwgmp\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.093293 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-catalog-content\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.093444 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-utilities\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.093497 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwgmp\" (UniqueName: \"kubernetes.io/projected/3b865325-c1a1-44f9-ac1f-2b414e52d499-kube-api-access-nwgmp\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.094324 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-catalog-content\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.094324 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-utilities\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.116095 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nwgmp\" (UniqueName: \"kubernetes.io/projected/3b865325-c1a1-44f9-ac1f-2b414e52d499-kube-api-access-nwgmp\") pod \"certified-operators-whwxq\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.151203 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.423954 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whwxq"] Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.634731 4618 generic.go:334] "Generic (PLEG): container finished" podID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerID="d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb" exitCode=0 Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.634838 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whwxq" event={"ID":"3b865325-c1a1-44f9-ac1f-2b414e52d499","Type":"ContainerDied","Data":"d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb"} Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.635027 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whwxq" event={"ID":"3b865325-c1a1-44f9-ac1f-2b414e52d499","Type":"ContainerStarted","Data":"298a303dd8698c4f4d5912bf0d8e6409c7e2762b89bd23f49526af00d402faea"} Jan 21 09:57:33 crc kubenswrapper[4618]: I0121 09:57:33.636346 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 09:57:34 crc kubenswrapper[4618]: I0121 09:57:34.643101 4618 generic.go:334] "Generic (PLEG): container finished" podID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerID="6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f" exitCode=0 Jan 21 09:57:34 crc kubenswrapper[4618]: I0121 
09:57:34.643183 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whwxq" event={"ID":"3b865325-c1a1-44f9-ac1f-2b414e52d499","Type":"ContainerDied","Data":"6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f"} Jan 21 09:57:35 crc kubenswrapper[4618]: I0121 09:57:35.664030 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whwxq" event={"ID":"3b865325-c1a1-44f9-ac1f-2b414e52d499","Type":"ContainerStarted","Data":"d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c"} Jan 21 09:57:35 crc kubenswrapper[4618]: I0121 09:57:35.677389 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whwxq" podStartSLOduration=2.121669 podStartE2EDuration="3.677370246s" podCreationTimestamp="2026-01-21 09:57:32 +0000 UTC" firstStartedPulling="2026-01-21 09:57:33.63608466 +0000 UTC m=+3252.386551977" lastFinishedPulling="2026-01-21 09:57:35.191785906 +0000 UTC m=+3253.942253223" observedRunningTime="2026-01-21 09:57:35.67652129 +0000 UTC m=+3254.426988608" watchObservedRunningTime="2026-01-21 09:57:35.677370246 +0000 UTC m=+3254.427837564" Jan 21 09:57:41 crc kubenswrapper[4618]: I0121 09:57:41.542995 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:57:41 crc kubenswrapper[4618]: E0121 09:57:41.543859 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:57:43 crc kubenswrapper[4618]: E0121 09:57:43.139901 4618 upgradeaware.go:441] Error 
proxying data from backend to client: writeto tcp 192.168.25.98:53666->192.168.25.98:41395: read tcp 192.168.25.98:53666->192.168.25.98:41395: read: connection reset by peer Jan 21 09:57:43 crc kubenswrapper[4618]: I0121 09:57:43.152617 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:43 crc kubenswrapper[4618]: I0121 09:57:43.152655 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:43 crc kubenswrapper[4618]: I0121 09:57:43.190155 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:43 crc kubenswrapper[4618]: I0121 09:57:43.757898 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:43 crc kubenswrapper[4618]: I0121 09:57:43.800697 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whwxq"] Jan 21 09:57:45 crc kubenswrapper[4618]: I0121 09:57:45.748914 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whwxq" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="registry-server" containerID="cri-o://d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c" gracePeriod=2 Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.204915 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.336928 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwgmp\" (UniqueName: \"kubernetes.io/projected/3b865325-c1a1-44f9-ac1f-2b414e52d499-kube-api-access-nwgmp\") pod \"3b865325-c1a1-44f9-ac1f-2b414e52d499\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.336986 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-utilities\") pod \"3b865325-c1a1-44f9-ac1f-2b414e52d499\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.337021 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-catalog-content\") pod \"3b865325-c1a1-44f9-ac1f-2b414e52d499\" (UID: \"3b865325-c1a1-44f9-ac1f-2b414e52d499\") " Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.338629 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-utilities" (OuterVolumeSpecName: "utilities") pod "3b865325-c1a1-44f9-ac1f-2b414e52d499" (UID: "3b865325-c1a1-44f9-ac1f-2b414e52d499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.343105 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b865325-c1a1-44f9-ac1f-2b414e52d499-kube-api-access-nwgmp" (OuterVolumeSpecName: "kube-api-access-nwgmp") pod "3b865325-c1a1-44f9-ac1f-2b414e52d499" (UID: "3b865325-c1a1-44f9-ac1f-2b414e52d499"). InnerVolumeSpecName "kube-api-access-nwgmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.371399 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b865325-c1a1-44f9-ac1f-2b414e52d499" (UID: "3b865325-c1a1-44f9-ac1f-2b414e52d499"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.439365 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwgmp\" (UniqueName: \"kubernetes.io/projected/3b865325-c1a1-44f9-ac1f-2b414e52d499-kube-api-access-nwgmp\") on node \"crc\" DevicePath \"\"" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.439404 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.439418 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b865325-c1a1-44f9-ac1f-2b414e52d499-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.777682 4618 generic.go:334] "Generic (PLEG): container finished" podID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerID="d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c" exitCode=0 Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.777738 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whwxq" event={"ID":"3b865325-c1a1-44f9-ac1f-2b414e52d499","Type":"ContainerDied","Data":"d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c"} Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.777788 4618 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-whwxq" event={"ID":"3b865325-c1a1-44f9-ac1f-2b414e52d499","Type":"ContainerDied","Data":"298a303dd8698c4f4d5912bf0d8e6409c7e2762b89bd23f49526af00d402faea"} Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.777807 4618 scope.go:117] "RemoveContainer" containerID="d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.777986 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whwxq" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.805407 4618 scope.go:117] "RemoveContainer" containerID="6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.836199 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whwxq"] Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.840246 4618 scope.go:117] "RemoveContainer" containerID="d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.845314 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whwxq"] Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.876120 4618 scope.go:117] "RemoveContainer" containerID="d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c" Jan 21 09:57:46 crc kubenswrapper[4618]: E0121 09:57:46.877348 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c\": container with ID starting with d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c not found: ID does not exist" containerID="d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 
09:57:46.877459 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c"} err="failed to get container status \"d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c\": rpc error: code = NotFound desc = could not find container \"d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c\": container with ID starting with d7462d6da3a366c6db4aa950db45f3a6641c59454236aeada490b63a586cef4c not found: ID does not exist" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.877553 4618 scope.go:117] "RemoveContainer" containerID="6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f" Jan 21 09:57:46 crc kubenswrapper[4618]: E0121 09:57:46.879901 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f\": container with ID starting with 6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f not found: ID does not exist" containerID="6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.879932 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f"} err="failed to get container status \"6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f\": rpc error: code = NotFound desc = could not find container \"6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f\": container with ID starting with 6a6e3bd01a393d2def1ee54871e09e1299d2d63ebeccd07a40f723c7cb415f9f not found: ID does not exist" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.879952 4618 scope.go:117] "RemoveContainer" containerID="d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb" Jan 21 09:57:46 crc 
kubenswrapper[4618]: E0121 09:57:46.880677 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb\": container with ID starting with d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb not found: ID does not exist" containerID="d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb" Jan 21 09:57:46 crc kubenswrapper[4618]: I0121 09:57:46.880705 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb"} err="failed to get container status \"d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb\": rpc error: code = NotFound desc = could not find container \"d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb\": container with ID starting with d30eed30641a7c9ba58a7703bcf2299e7090d1fc4ead15af65bf7f28736fb7eb not found: ID does not exist" Jan 21 09:57:47 crc kubenswrapper[4618]: I0121 09:57:47.544973 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" path="/var/lib/kubelet/pods/3b865325-c1a1-44f9-ac1f-2b414e52d499/volumes" Jan 21 09:57:50 crc kubenswrapper[4618]: E0121 09:57:50.698235 4618 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.98:39348->192.168.25.98:41395: write tcp 192.168.25.98:39348->192.168.25.98:41395: write: broken pipe Jan 21 09:57:52 crc kubenswrapper[4618]: I0121 09:57:52.537782 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:57:52 crc kubenswrapper[4618]: E0121 09:57:52.538414 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:58:06 crc kubenswrapper[4618]: I0121 09:58:06.538596 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:58:06 crc kubenswrapper[4618]: E0121 09:58:06.539483 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:58:17 crc kubenswrapper[4618]: I0121 09:58:17.538430 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:58:17 crc kubenswrapper[4618]: E0121 09:58:17.545343 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:58:24 crc kubenswrapper[4618]: I0121 09:58:24.080385 4618 generic.go:334] "Generic (PLEG): container finished" podID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerID="ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a" exitCode=0 Jan 21 09:58:24 crc kubenswrapper[4618]: I0121 09:58:24.080488 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" 
event={"ID":"5bde442f-35fc-4320-a8b8-23f1f0b7a18b","Type":"ContainerDied","Data":"ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a"} Jan 21 09:58:24 crc kubenswrapper[4618]: I0121 09:58:24.081594 4618 scope.go:117] "RemoveContainer" containerID="ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a" Jan 21 09:58:24 crc kubenswrapper[4618]: I0121 09:58:24.614953 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kd9nr_must-gather-w2c2x_5bde442f-35fc-4320-a8b8-23f1f0b7a18b/gather/0.log" Jan 21 09:58:29 crc kubenswrapper[4618]: I0121 09:58:29.538413 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:58:29 crc kubenswrapper[4618]: E0121 09:58:29.542036 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.365365 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kd9nr/must-gather-w2c2x"] Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.366187 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="copy" containerID="cri-o://06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935" gracePeriod=2 Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.375516 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kd9nr/must-gather-w2c2x"] Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.755421 4618 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kd9nr_must-gather-w2c2x_5bde442f-35fc-4320-a8b8-23f1f0b7a18b/copy/0.log" Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.756309 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.871919 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j27pn\" (UniqueName: \"kubernetes.io/projected/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-kube-api-access-j27pn\") pod \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.872135 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-must-gather-output\") pod \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\" (UID: \"5bde442f-35fc-4320-a8b8-23f1f0b7a18b\") " Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.878833 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-kube-api-access-j27pn" (OuterVolumeSpecName: "kube-api-access-j27pn") pod "5bde442f-35fc-4320-a8b8-23f1f0b7a18b" (UID: "5bde442f-35fc-4320-a8b8-23f1f0b7a18b"). InnerVolumeSpecName "kube-api-access-j27pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 09:58:32 crc kubenswrapper[4618]: I0121 09:58:32.982436 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j27pn\" (UniqueName: \"kubernetes.io/projected/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-kube-api-access-j27pn\") on node \"crc\" DevicePath \"\"" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.017372 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5bde442f-35fc-4320-a8b8-23f1f0b7a18b" (UID: "5bde442f-35fc-4320-a8b8-23f1f0b7a18b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.084622 4618 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5bde442f-35fc-4320-a8b8-23f1f0b7a18b-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.172929 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kd9nr_must-gather-w2c2x_5bde442f-35fc-4320-a8b8-23f1f0b7a18b/copy/0.log" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.176471 4618 generic.go:334] "Generic (PLEG): container finished" podID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerID="06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935" exitCode=143 Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.176523 4618 scope.go:117] "RemoveContainer" containerID="06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.176542 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kd9nr/must-gather-w2c2x" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.199267 4618 scope.go:117] "RemoveContainer" containerID="ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.271834 4618 scope.go:117] "RemoveContainer" containerID="06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935" Jan 21 09:58:33 crc kubenswrapper[4618]: E0121 09:58:33.272245 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935\": container with ID starting with 06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935 not found: ID does not exist" containerID="06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.272274 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935"} err="failed to get container status \"06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935\": rpc error: code = NotFound desc = could not find container \"06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935\": container with ID starting with 06c3b88f873906089b72ec44063acfe6100ad2f4ce19d8e8c777b98a8ca5e935 not found: ID does not exist" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.272292 4618 scope.go:117] "RemoveContainer" containerID="ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a" Jan 21 09:58:33 crc kubenswrapper[4618]: E0121 09:58:33.272647 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a\": container with ID starting with 
ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a not found: ID does not exist" containerID="ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.272666 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a"} err="failed to get container status \"ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a\": rpc error: code = NotFound desc = could not find container \"ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a\": container with ID starting with ff584deb524949eebf0b3337ac749d7f6d2a59c2ad1aa32cbe0e9c4c29560b0a not found: ID does not exist" Jan 21 09:58:33 crc kubenswrapper[4618]: I0121 09:58:33.545568 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" path="/var/lib/kubelet/pods/5bde442f-35fc-4320-a8b8-23f1f0b7a18b/volumes" Jan 21 09:58:41 crc kubenswrapper[4618]: I0121 09:58:41.543492 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:58:41 crc kubenswrapper[4618]: E0121 09:58:41.544490 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:58:55 crc kubenswrapper[4618]: I0121 09:58:55.538667 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:58:55 crc kubenswrapper[4618]: E0121 09:58:55.539583 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:59:09 crc kubenswrapper[4618]: I0121 09:59:09.538185 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:59:09 crc kubenswrapper[4618]: E0121 09:59:09.538791 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:59:24 crc kubenswrapper[4618]: I0121 09:59:24.538193 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:59:24 crc kubenswrapper[4618]: E0121 09:59:24.539008 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:59:35 crc kubenswrapper[4618]: I0121 09:59:35.538192 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:59:35 crc kubenswrapper[4618]: E0121 09:59:35.538929 4618 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.834850 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pm2m4/must-gather-mv5cc"] Jan 21 09:59:46 crc kubenswrapper[4618]: E0121 09:59:46.835839 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="gather" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.835853 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="gather" Jan 21 09:59:46 crc kubenswrapper[4618]: E0121 09:59:46.835863 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="copy" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.835868 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="copy" Jan 21 09:59:46 crc kubenswrapper[4618]: E0121 09:59:46.835874 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="registry-server" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.835880 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="registry-server" Jan 21 09:59:46 crc kubenswrapper[4618]: E0121 09:59:46.835905 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="extract-content" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.835912 4618 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="extract-content" Jan 21 09:59:46 crc kubenswrapper[4618]: E0121 09:59:46.835924 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="extract-utilities" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.835930 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="extract-utilities" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.836104 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b865325-c1a1-44f9-ac1f-2b414e52d499" containerName="registry-server" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.836118 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="gather" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.836134 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bde442f-35fc-4320-a8b8-23f1f0b7a18b" containerName="copy" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.837051 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.838946 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pm2m4"/"openshift-service-ca.crt" Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.845563 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pm2m4/must-gather-mv5cc"] Jan 21 09:59:46 crc kubenswrapper[4618]: I0121 09:59:46.845640 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pm2m4"/"kube-root-ca.crt" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.035502 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgd9\" (UniqueName: \"kubernetes.io/projected/8f533448-f67c-46a4-bda8-f432fd43e484-kube-api-access-jrgd9\") pod \"must-gather-mv5cc\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.035704 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f533448-f67c-46a4-bda8-f432fd43e484-must-gather-output\") pod \"must-gather-mv5cc\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.137806 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgd9\" (UniqueName: \"kubernetes.io/projected/8f533448-f67c-46a4-bda8-f432fd43e484-kube-api-access-jrgd9\") pod \"must-gather-mv5cc\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.137944 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f533448-f67c-46a4-bda8-f432fd43e484-must-gather-output\") pod \"must-gather-mv5cc\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.138351 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f533448-f67c-46a4-bda8-f432fd43e484-must-gather-output\") pod \"must-gather-mv5cc\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.155630 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgd9\" (UniqueName: \"kubernetes.io/projected/8f533448-f67c-46a4-bda8-f432fd43e484-kube-api-access-jrgd9\") pod \"must-gather-mv5cc\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.451716 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.539166 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 09:59:47 crc kubenswrapper[4618]: E0121 09:59:47.539470 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 09:59:47 crc kubenswrapper[4618]: I0121 09:59:47.839937 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pm2m4/must-gather-mv5cc"] Jan 21 09:59:48 crc kubenswrapper[4618]: I0121 09:59:48.783265 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" event={"ID":"8f533448-f67c-46a4-bda8-f432fd43e484","Type":"ContainerStarted","Data":"e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805"} Jan 21 09:59:48 crc kubenswrapper[4618]: I0121 09:59:48.783713 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" event={"ID":"8f533448-f67c-46a4-bda8-f432fd43e484","Type":"ContainerStarted","Data":"0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3"} Jan 21 09:59:48 crc kubenswrapper[4618]: I0121 09:59:48.783787 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" event={"ID":"8f533448-f67c-46a4-bda8-f432fd43e484","Type":"ContainerStarted","Data":"968801c01eb54c2b2f128afd5a2c44855ef300da024135f147ff00469033c8df"} Jan 21 09:59:48 crc kubenswrapper[4618]: I0121 09:59:48.813191 4618 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" podStartSLOduration=2.813169622 podStartE2EDuration="2.813169622s" podCreationTimestamp="2026-01-21 09:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:59:48.805067369 +0000 UTC m=+3387.555534686" watchObservedRunningTime="2026-01-21 09:59:48.813169622 +0000 UTC m=+3387.563636939" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.079696 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-tsxhc"] Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.081481 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.083382 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pm2m4"/"default-dockercfg-7z4sw" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.135985 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0086620-a966-4319-b0ff-2645ad2dc1ba-host\") pod \"crc-debug-tsxhc\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.136357 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrm9p\" (UniqueName: \"kubernetes.io/projected/f0086620-a966-4319-b0ff-2645ad2dc1ba-kube-api-access-xrm9p\") pod \"crc-debug-tsxhc\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.238889 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f0086620-a966-4319-b0ff-2645ad2dc1ba-host\") pod \"crc-debug-tsxhc\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.239022 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0086620-a966-4319-b0ff-2645ad2dc1ba-host\") pod \"crc-debug-tsxhc\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.239268 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrm9p\" (UniqueName: \"kubernetes.io/projected/f0086620-a966-4319-b0ff-2645ad2dc1ba-kube-api-access-xrm9p\") pod \"crc-debug-tsxhc\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.264876 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrm9p\" (UniqueName: \"kubernetes.io/projected/f0086620-a966-4319-b0ff-2645ad2dc1ba-kube-api-access-xrm9p\") pod \"crc-debug-tsxhc\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.396125 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 09:59:51 crc kubenswrapper[4618]: W0121 09:59:51.420687 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0086620_a966_4319_b0ff_2645ad2dc1ba.slice/crio-337dfc7a1b500a1729bb7efd435461159d247db22f277e12d14785400bc7e940 WatchSource:0}: Error finding container 337dfc7a1b500a1729bb7efd435461159d247db22f277e12d14785400bc7e940: Status 404 returned error can't find the container with id 337dfc7a1b500a1729bb7efd435461159d247db22f277e12d14785400bc7e940 Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.811076 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" event={"ID":"f0086620-a966-4319-b0ff-2645ad2dc1ba","Type":"ContainerStarted","Data":"fc41cf79517a6c9e8336b5f5118abe2ae21fd1c1738587887c7f78893ea1f0ea"} Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.811837 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" event={"ID":"f0086620-a966-4319-b0ff-2645ad2dc1ba","Type":"ContainerStarted","Data":"337dfc7a1b500a1729bb7efd435461159d247db22f277e12d14785400bc7e940"} Jan 21 09:59:51 crc kubenswrapper[4618]: I0121 09:59:51.828955 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" podStartSLOduration=0.82894249 podStartE2EDuration="828.94249ms" podCreationTimestamp="2026-01-21 09:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 09:59:51.82246372 +0000 UTC m=+3390.572931037" watchObservedRunningTime="2026-01-21 09:59:51.82894249 +0000 UTC m=+3390.579409807" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.465312 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-68d7cbc6d4-mthph_add10569-0b7d-47e6-a9fc-943ff2f54fc4/barbican-api-log/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.475882 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68d7cbc6d4-mthph_add10569-0b7d-47e6-a9fc-943ff2f54fc4/barbican-api/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.510063 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c95ff478-nbsz8_4c1702d5-7295-4662-956a-180ac3b7c04d/barbican-keystone-listener-log/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.517989 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c95ff478-nbsz8_4c1702d5-7295-4662-956a-180ac3b7c04d/barbican-keystone-listener/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.537657 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df6b49cb5-9npwx_65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b/barbican-worker-log/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.550257 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df6b49cb5-9npwx_65fa8a36-04b0-4ca6-9ca3-bf4efc359c9b/barbican-worker/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.580925 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5s2hn_194d2dc3-6b5c-4cdf-aa65-3d6c088d1e90/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.611953 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/ceilometer-central-agent/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.637573 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/ceilometer-notification-agent/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.642682 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/sg-core/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.654641 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2ceaf0a-1783-4b11-9f67-a5c8948c589d/proxy-httpd/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.667311 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3c710262-7141-4edf-8f70-b5ee3d235970/cinder-api-log/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.711849 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3c710262-7141-4edf-8f70-b5ee3d235970/cinder-api/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.765123 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_def78b06-bd3c-4722-82a7-15b80abe36fe/cinder-scheduler/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.793158 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_def78b06-bd3c-4722-82a7-15b80abe36fe/probe/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.818109 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-44vvj_1b8522ab-9a18-468c-a001-27aa7228e059/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.839757 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9bzq9_3d61b58a-5231-47ee-8d01-2eb51a1def0c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 
09:59:53.888476 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7b79b84c-h9xrw_9f529ba0-9024-4b63-8d19-bb798710ce6f/dnsmasq-dns/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.892550 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d7b79b84c-h9xrw_9f529ba0-9024-4b63-8d19-bb798710ce6f/init/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.920313 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nk2g2_9fab9896-c90d-47af-9a73-4cf53b19d631/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.931384 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f8cbddef-d1fd-490f-b499-3a9d2e570bce/glance-log/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.947161 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f8cbddef-d1fd-490f-b499-3a9d2e570bce/glance-httpd/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.957048 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c616426c-57a8-42a0-8dde-7ef7f56caf00/glance-log/0.log" Jan 21 09:59:53 crc kubenswrapper[4618]: I0121 09:59:53.974058 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c616426c-57a8-42a0-8dde-7ef7f56caf00/glance-httpd/0.log" Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.264002 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-87c49d4f8-74x7z_d4a5a9b2-1432-43cc-bfe1-58285caf06ea/horizon-log/0.log" Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.341975 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-87c49d4f8-74x7z_d4a5a9b2-1432-43cc-bfe1-58285caf06ea/horizon/0.log" 
Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.362265 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-476wc_efaee91b-ca19-44cc-b8b4-37f6bf34067a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.387259 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4b7z2_3c8e6c05-f3d4-4215-bda3-1cb4b69cbfde/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.537483 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76759dfdcd-gxbvm_b62e287a-7db9-4d83-aae5-9cc273fff127/keystone-api/0.log" Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.545688 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8bca57d0-25ca-4daa-a96b-b3b70b2a2ac3/kube-state-metrics/0.log" Jan 21 09:59:54 crc kubenswrapper[4618]: I0121 09:59:54.575673 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7dcvr_5ff62cc0-5880-4589-ac86-a671f9533ff4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.141116 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb"] Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.144212 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.147419 4618 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.147626 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.150529 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb"] Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.222902 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-config-volume\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.223112 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-secret-volume\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.223334 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc2vg\" (UniqueName: \"kubernetes.io/projected/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-kube-api-access-xc2vg\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.326797 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-secret-volume\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.327274 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc2vg\" (UniqueName: \"kubernetes.io/projected/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-kube-api-access-xc2vg\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.327403 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-config-volume\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.328353 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-config-volume\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.352820 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-secret-volume\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.355561 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc2vg\" (UniqueName: \"kubernetes.io/projected/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-kube-api-access-xc2vg\") pod \"collect-profiles-29483160-jktjb\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.471032 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:00 crc kubenswrapper[4618]: I0121 10:00:00.887330 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb"] Jan 21 10:00:01 crc kubenswrapper[4618]: I0121 10:00:01.896882 4618 generic.go:334] "Generic (PLEG): container finished" podID="f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" containerID="34c2dd12347918b4e8d84186354e307ead5ca187a69186329b15e3d48ad7f7d7" exitCode=0 Jan 21 10:00:01 crc kubenswrapper[4618]: I0121 10:00:01.897181 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" event={"ID":"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5","Type":"ContainerDied","Data":"34c2dd12347918b4e8d84186354e307ead5ca187a69186329b15e3d48ad7f7d7"} Jan 21 10:00:01 crc kubenswrapper[4618]: I0121 10:00:01.897207 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" 
event={"ID":"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5","Type":"ContainerStarted","Data":"d66d275ad7bfbc3b48b89d424bc0b15090216281ab151c3e86d0e2e642a80895"} Jan 21 10:00:02 crc kubenswrapper[4618]: I0121 10:00:02.538060 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 10:00:02 crc kubenswrapper[4618]: E0121 10:00:02.538595 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.284866 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.303608 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-config-volume\") pod \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.303722 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-secret-volume\") pod \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.303877 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc2vg\" (UniqueName: 
\"kubernetes.io/projected/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-kube-api-access-xc2vg\") pod \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\" (UID: \"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5\") " Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.304814 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" (UID: "f09fc3b3-a435-49e8-b91e-4f083bd5a3b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.327577 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" (UID: "f09fc3b3-a435-49e8-b91e-4f083bd5a3b5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.328801 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-kube-api-access-xc2vg" (OuterVolumeSpecName: "kube-api-access-xc2vg") pod "f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" (UID: "f09fc3b3-a435-49e8-b91e-4f083bd5a3b5"). InnerVolumeSpecName "kube-api-access-xc2vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.405904 4618 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.405938 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc2vg\" (UniqueName: \"kubernetes.io/projected/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-kube-api-access-xc2vg\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.405949 4618 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f09fc3b3-a435-49e8-b91e-4f083bd5a3b5-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.912469 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" event={"ID":"f09fc3b3-a435-49e8-b91e-4f083bd5a3b5","Type":"ContainerDied","Data":"d66d275ad7bfbc3b48b89d424bc0b15090216281ab151c3e86d0e2e642a80895"} Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.913071 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66d275ad7bfbc3b48b89d424bc0b15090216281ab151c3e86d0e2e642a80895" Jan 21 10:00:03 crc kubenswrapper[4618]: I0121 10:00:03.913240 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483160-jktjb" Jan 21 10:00:04 crc kubenswrapper[4618]: I0121 10:00:04.356066 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl"] Jan 21 10:00:04 crc kubenswrapper[4618]: I0121 10:00:04.361282 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483115-lw6gl"] Jan 21 10:00:05 crc kubenswrapper[4618]: I0121 10:00:05.549724 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36179011-6684-4dd8-9436-d4185ce96397" path="/var/lib/kubelet/pods/36179011-6684-4dd8-9436-d4185ce96397/volumes" Jan 21 10:00:11 crc kubenswrapper[4618]: I0121 10:00:11.189691 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f0b3230e-10e8-4707-8944-b59b1870a4fc/memcached/0.log" Jan 21 10:00:11 crc kubenswrapper[4618]: I0121 10:00:11.269701 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58985598b5-rf45g_dd14fbe1-01de-41f5-9247-d15844d8c697/neutron-api/0.log" Jan 21 10:00:11 crc kubenswrapper[4618]: I0121 10:00:11.328916 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-58985598b5-rf45g_dd14fbe1-01de-41f5-9247-d15844d8c697/neutron-httpd/0.log" Jan 21 10:00:11 crc kubenswrapper[4618]: I0121 10:00:11.448391 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-n97pm_0bbeab64-b3ee-4412-a66b-c5871248bddb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:11 crc kubenswrapper[4618]: I0121 10:00:11.609928 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9edd45d0-acce-46fa-b1d2-29ddc021d690/nova-api-log/0.log" Jan 21 10:00:11 crc kubenswrapper[4618]: I0121 10:00:11.997999 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_9edd45d0-acce-46fa-b1d2-29ddc021d690/nova-api-api/0.log" Jan 21 10:00:12 crc kubenswrapper[4618]: I0121 10:00:12.169874 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e717fabb-c7f8-4c12-a063-e9a5b0d2a671/nova-cell0-conductor-conductor/0.log" Jan 21 10:00:12 crc kubenswrapper[4618]: I0121 10:00:12.296021 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7dd5d2fd-6aeb-4bec-88ce-4d6ae0887198/nova-cell1-conductor-conductor/0.log" Jan 21 10:00:12 crc kubenswrapper[4618]: I0121 10:00:12.392114 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d3c60646-177a-4ed0-ab65-6a9ba9f3b7aa/nova-cell1-novncproxy-novncproxy/0.log" Jan 21 10:00:12 crc kubenswrapper[4618]: I0121 10:00:12.441873 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-r4524_d826d9d4-6108-4f59-9c79-313f8f3b3d19/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:12 crc kubenswrapper[4618]: I0121 10:00:12.514640 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bcd77836-0c95-4165-8e69-9f1851be8f50/nova-metadata-log/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.375782 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bcd77836-0c95-4165-8e69-9f1851be8f50/nova-metadata-metadata/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.535862 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_17677a46-bed7-4316-91a1-e7d842f83d91/nova-scheduler-scheduler/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.538519 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 10:00:13 crc kubenswrapper[4618]: E0121 10:00:13.538779 4618 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.563463 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_424be4c5-5cc7-4641-b497-f01556c3d8ea/galera/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.573171 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_424be4c5-5cc7-4641-b497-f01556c3d8ea/mysql-bootstrap/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.596552 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33a99731-3bad-4a35-97bc-2431645071bb/galera/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.620960 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_33a99731-3bad-4a35-97bc-2431645071bb/mysql-bootstrap/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.627570 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6039b2d9-1ca5-480a-a1a4-f5ec50e082aa/openstackclient/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.639641 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-svclb_8a225614-1514-4820-8eff-8d760ef9a0b3/openstack-network-exporter/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.652176 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z8pqg_27ea43db-9444-46a2-aa4f-824245113798/ovsdb-server/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 
10:00:13.675926 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z8pqg_27ea43db-9444-46a2-aa4f-824245113798/ovs-vswitchd/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.682178 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z8pqg_27ea43db-9444-46a2-aa4f-824245113798/ovsdb-server-init/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.694485 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-v4n6j_a889a44f-3ea7-4b43-b5ea-1f365a9611ac/ovn-controller/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.737040 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-25xdp_28d56297-035c-4b19-8135-4d63d60b9b62/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.747665 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00770641-2364-454a-9b73-663281ad8df0/ovn-northd/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.753013 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00770641-2364-454a-9b73-663281ad8df0/openstack-network-exporter/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.770997 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc37a59b-ed3a-4007-b6bc-da3078536c98/ovsdbserver-nb/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.774662 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc37a59b-ed3a-4007-b6bc-da3078536c98/openstack-network-exporter/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.789320 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93d72e0b-9c67-4d3c-8eaf-b40cbf04df89/ovsdbserver-sb/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: 
I0121 10:00:13.793411 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93d72e0b-9c67-4d3c-8eaf-b40cbf04df89/openstack-network-exporter/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.901165 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54d488db9b-swfld_4634a027-fe25-4458-9f23-b984afd7a60f/placement-log/0.log" Jan 21 10:00:13 crc kubenswrapper[4618]: I0121 10:00:13.968079 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54d488db9b-swfld_4634a027-fe25-4458-9f23-b984afd7a60f/placement-api/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.011930 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6594f517-1fec-47c9-909d-674c8a7f36dd/rabbitmq/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.016580 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6594f517-1fec-47c9-909d-674c8a7f36dd/setup-container/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.041805 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a257ccd-7e16-4450-810b-14a2dca56eab/rabbitmq/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.054852 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a257ccd-7e16-4450-810b-14a2dca56eab/setup-container/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.073933 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-g44wn_182b5ccb-0f34-47f6-b087-ceed41764dc6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.083047 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qg6fs_1d012e11-c226-4c6f-b646-6358036a6924/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.092362 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r68kz_ac42dc63-60fa-42fe-8497-f7164e407083/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.100217 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s4dgr_1ce37433-9d98-4388-8374-b3a26afdd1c3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.112771 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-bkf9g_aed2c63b-6043-45ca-90ac-b445dc0112fe/ssh-known-hosts-edpm-deployment/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.235839 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d5bd5664f-ncbh6_3fcbc9a4-5180-4530-8003-a54391ebbd6c/proxy-httpd/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.253357 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d5bd5664f-ncbh6_3fcbc9a4-5180-4530-8003-a54391ebbd6c/proxy-server/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.260992 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-l5bzv_a34a0fe1-3391-4b76-8274-d817bcca6d03/swift-ring-rebalance/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.286915 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-server/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.309507 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-replicator/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.314196 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-auditor/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.321389 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/account-reaper/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.328293 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-server/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.355502 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-replicator/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.359846 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-auditor/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.366604 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/container-updater/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.374387 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-server/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.394626 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-replicator/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.411807 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-auditor/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.418369 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-updater/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.427136 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/object-expirer/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.432170 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/rsync/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.438639 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67864b8b-0c06-4f06-8b43-87fcdd8a3d42/swift-recon-cron/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.500833 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-f8rcn_4d8904ba-60fd-453f-884f-6fe7003c205f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.516515 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_17e85cf8-1423-4fd8-a5c0-367c58482277/tempest-tests-tempest-tests-runner/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.522217 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_807b7b22-0fea-4aa1-bb39-fe47c6ed13c9/test-operator-logs-container/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.537234 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dcnnh_05574b5d-bd37-4837-a247-9f1f5bb09d09/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.987266 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/controller/0.log" Jan 21 10:00:14 crc kubenswrapper[4618]: I0121 10:00:14.992486 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/kube-rbac-proxy/0.log" Jan 21 10:00:15 crc kubenswrapper[4618]: I0121 10:00:15.017887 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/controller/0.log" Jan 21 10:00:15 crc kubenswrapper[4618]: I0121 10:00:15.822791 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/extract/0.log" Jan 21 10:00:15 crc kubenswrapper[4618]: I0121 10:00:15.841284 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/util/0.log" Jan 21 10:00:15 crc kubenswrapper[4618]: I0121 10:00:15.850582 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/pull/0.log" Jan 21 10:00:15 crc kubenswrapper[4618]: I0121 10:00:15.936056 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-6j9f2_982d4204-447a-43c3-858e-c16cceebf1bb/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.010113 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6zn64_d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.023290 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-nf54z_f3975776-d0c3-478c-873c-349415bf2d3c/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.153349 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4r2qm_e0011800-e28a-4e71-8306-819d8d865dfe/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.172314 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-ms7zc_276f144f-a185-46da-a3af-f0aa8a9eaaad/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.206206 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-bd65l_0ff11d9c-92c7-4b78-8336-70e117f63880/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.346649 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.355777 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/reloader/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.359442 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr-metrics/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.369571 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy/0.log" Jan 21 
10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.375543 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy-frr/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.390366 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-frr-files/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.406541 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-reloader/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.411835 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-metrics/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.425927 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2l8f6_0b1f4460-bb9d-4f03-a4bd-57e0a5f79669/frr-k8s-webhook-server/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.459607 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-656ff8bd-4klk8_4b0325f8-aa62-451f-84b7-9f393225ff9d/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.469853 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8485b999df-6fwkm_ecb8ccb1-678b-4dd5-be5e-8296b9305053/webhook-server/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.479158 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-dsjzx_cad4873a-5a2e-40ea-a4b1-3173e8138be0/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.492028 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-g58rl_80cee31f-467d-4c99-8b58-1edbee74f4a9/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.580856 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-l55q5_69396ad4-b4ad-4f43-a0f5-83b655e590da/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.590717 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-djc75_61c3771f-ea2c-4307-8d5b-7f44194235cd/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.624234 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-lsgpp_0ec13d1d-fae7-4efd-92d6-0b93f972694f/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.692663 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-5m9wn_f0bde946-f6c9-45a5-a124-6cf62551f0bc/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.818360 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-j5xjz_14908c8c-b444-4359-9e3a-e0fcc443e9f7/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.828540 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-cmhx4_1739988f-1de9-4c68-85ac-c14971105314/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.846313 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b9875986dvc9c5_b662a5ae-39f6-4592-baf2-efa15f7c82b0/manager/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.889110 4618 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/speaker/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.893473 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/kube-rbac-proxy/0.log" Jan 21 10:00:16 crc kubenswrapper[4618]: I0121 10:00:16.982718 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6d4d7d8545-hbl4s_049e7414-823b-45cc-92e6-da0652157046/operator/0.log" Jan 21 10:00:17 crc kubenswrapper[4618]: I0121 10:00:17.887495 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75bfd788c8-42lr9_cfa3b66e-c251-46f7-ade1-edd4df56db67/manager/0.log" Jan 21 10:00:17 crc kubenswrapper[4618]: I0121 10:00:17.930794 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6m77l_fa1a4914-7994-4004-b3aa-b3bbf62ed6df/registry-server/0.log" Jan 21 10:00:17 crc kubenswrapper[4618]: I0121 10:00:17.971418 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-7nkmc_b3629416-c45e-46da-98ba-dfd8b6630abd/manager/0.log" Jan 21 10:00:17 crc kubenswrapper[4618]: I0121 10:00:17.992655 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-r895x_1f7120e5-8e39-4664-9d63-beaea1ff4043/manager/0.log" Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.008308 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9nmj5_1bab5bac-6dfb-48f0-bf21-71dbfb2d3653/operator/0.log" Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.028168 4618 generic.go:334] "Generic (PLEG): container finished" 
podID="f0086620-a966-4319-b0ff-2645ad2dc1ba" containerID="fc41cf79517a6c9e8336b5f5118abe2ae21fd1c1738587887c7f78893ea1f0ea" exitCode=0 Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.028278 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" event={"ID":"f0086620-a966-4319-b0ff-2645ad2dc1ba","Type":"ContainerDied","Data":"fc41cf79517a6c9e8336b5f5118abe2ae21fd1c1738587887c7f78893ea1f0ea"} Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.031074 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-zgrxl_5af2019b-e469-403f-8c3e-91006f2902ad/manager/0.log" Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.069119 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-778qv_16d3b481-106a-48ee-b99c-7a380086a9cd/manager/0.log" Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.077418 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-g4khd_e4f5bddf-5e04-4510-903b-6861f19fa87b/manager/0.log" Jan 21 10:00:18 crc kubenswrapper[4618]: I0121 10:00:18.086442 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-czzg6_010792a0-26fd-456a-9186-79799c9a511e/manager/0.log" Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.108525 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.177202 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-tsxhc"] Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.190700 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-tsxhc"] Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.236482 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrm9p\" (UniqueName: \"kubernetes.io/projected/f0086620-a966-4319-b0ff-2645ad2dc1ba-kube-api-access-xrm9p\") pod \"f0086620-a966-4319-b0ff-2645ad2dc1ba\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.236594 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0086620-a966-4319-b0ff-2645ad2dc1ba-host\") pod \"f0086620-a966-4319-b0ff-2645ad2dc1ba\" (UID: \"f0086620-a966-4319-b0ff-2645ad2dc1ba\") " Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.236689 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0086620-a966-4319-b0ff-2645ad2dc1ba-host" (OuterVolumeSpecName: "host") pod "f0086620-a966-4319-b0ff-2645ad2dc1ba" (UID: "f0086620-a966-4319-b0ff-2645ad2dc1ba"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.237471 4618 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0086620-a966-4319-b0ff-2645ad2dc1ba-host\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.242374 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0086620-a966-4319-b0ff-2645ad2dc1ba-kube-api-access-xrm9p" (OuterVolumeSpecName: "kube-api-access-xrm9p") pod "f0086620-a966-4319-b0ff-2645ad2dc1ba" (UID: "f0086620-a966-4319-b0ff-2645ad2dc1ba"). InnerVolumeSpecName "kube-api-access-xrm9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.340028 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrm9p\" (UniqueName: \"kubernetes.io/projected/f0086620-a966-4319-b0ff-2645ad2dc1ba-kube-api-access-xrm9p\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:19 crc kubenswrapper[4618]: I0121 10:00:19.547555 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0086620-a966-4319-b0ff-2645ad2dc1ba" path="/var/lib/kubelet/pods/f0086620-a966-4319-b0ff-2645ad2dc1ba/volumes" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.042411 4618 scope.go:117] "RemoveContainer" containerID="fc41cf79517a6c9e8336b5f5118abe2ae21fd1c1738587887c7f78893ea1f0ea" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.042721 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-tsxhc" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.344621 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-wht4q"] Jan 21 10:00:20 crc kubenswrapper[4618]: E0121 10:00:20.345099 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" containerName="collect-profiles" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.345115 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" containerName="collect-profiles" Jan 21 10:00:20 crc kubenswrapper[4618]: E0121 10:00:20.345154 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0086620-a966-4319-b0ff-2645ad2dc1ba" containerName="container-00" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.345161 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0086620-a966-4319-b0ff-2645ad2dc1ba" containerName="container-00" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.345374 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09fc3b3-a435-49e8-b91e-4f083bd5a3b5" containerName="collect-profiles" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.345391 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0086620-a966-4319-b0ff-2645ad2dc1ba" containerName="container-00" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.346026 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.347772 4618 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pm2m4"/"default-dockercfg-7z4sw" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.466368 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhsc\" (UniqueName: \"kubernetes.io/projected/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-kube-api-access-klhsc\") pod \"crc-debug-wht4q\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.466592 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-host\") pod \"crc-debug-wht4q\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.568939 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-host\") pod \"crc-debug-wht4q\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.569021 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klhsc\" (UniqueName: \"kubernetes.io/projected/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-kube-api-access-klhsc\") pod \"crc-debug-wht4q\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.569081 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-host\") pod \"crc-debug-wht4q\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.586487 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhsc\" (UniqueName: \"kubernetes.io/projected/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-kube-api-access-klhsc\") pod \"crc-debug-wht4q\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:20 crc kubenswrapper[4618]: I0121 10:00:20.659864 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:21 crc kubenswrapper[4618]: I0121 10:00:21.053861 4618 generic.go:334] "Generic (PLEG): container finished" podID="ef9470a3-7ee5-41d1-8022-ba90ae4da33c" containerID="a8b5e9711fbb7ae42aac4903b6aaa1d0a4dc0d96d28445d12e26c939028cecc2" exitCode=0 Jan 21 10:00:21 crc kubenswrapper[4618]: I0121 10:00:21.053916 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-wht4q" event={"ID":"ef9470a3-7ee5-41d1-8022-ba90ae4da33c","Type":"ContainerDied","Data":"a8b5e9711fbb7ae42aac4903b6aaa1d0a4dc0d96d28445d12e26c939028cecc2"} Jan 21 10:00:21 crc kubenswrapper[4618]: I0121 10:00:21.053957 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-wht4q" event={"ID":"ef9470a3-7ee5-41d1-8022-ba90ae4da33c","Type":"ContainerStarted","Data":"062f3c025c0537c45f6618d2d232805c445d08f614a0f5bc3dd74666701968a2"} Jan 21 10:00:21 crc kubenswrapper[4618]: I0121 10:00:21.525051 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-wht4q"] Jan 21 10:00:21 crc kubenswrapper[4618]: I0121 10:00:21.532782 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-pm2m4/crc-debug-wht4q"] Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.138228 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.207855 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9t8g5_c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f/control-plane-machine-set-operator/0.log" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.232079 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/kube-rbac-proxy/0.log" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.242504 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/machine-api-operator/0.log" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.306407 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klhsc\" (UniqueName: \"kubernetes.io/projected/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-kube-api-access-klhsc\") pod \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.306504 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-host\") pod \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\" (UID: \"ef9470a3-7ee5-41d1-8022-ba90ae4da33c\") " Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.306624 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-host" (OuterVolumeSpecName: "host") pod 
"ef9470a3-7ee5-41d1-8022-ba90ae4da33c" (UID: "ef9470a3-7ee5-41d1-8022-ba90ae4da33c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.307412 4618 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-host\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.313130 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-kube-api-access-klhsc" (OuterVolumeSpecName: "kube-api-access-klhsc") pod "ef9470a3-7ee5-41d1-8022-ba90ae4da33c" (UID: "ef9470a3-7ee5-41d1-8022-ba90ae4da33c"). InnerVolumeSpecName "kube-api-access-klhsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.410392 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klhsc\" (UniqueName: \"kubernetes.io/projected/ef9470a3-7ee5-41d1-8022-ba90ae4da33c-kube-api-access-klhsc\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.750074 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-mf2md"] Jan 21 10:00:22 crc kubenswrapper[4618]: E0121 10:00:22.750529 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9470a3-7ee5-41d1-8022-ba90ae4da33c" containerName="container-00" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.750543 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9470a3-7ee5-41d1-8022-ba90ae4da33c" containerName="container-00" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.750727 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9470a3-7ee5-41d1-8022-ba90ae4da33c" containerName="container-00" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.751441 4618 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.920765 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqpvh\" (UniqueName: \"kubernetes.io/projected/b7dcc8f6-cabd-46ca-a229-47e740a59439-kube-api-access-jqpvh\") pod \"crc-debug-mf2md\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:22 crc kubenswrapper[4618]: I0121 10:00:22.920855 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7dcc8f6-cabd-46ca-a229-47e740a59439-host\") pod \"crc-debug-mf2md\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.022669 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqpvh\" (UniqueName: \"kubernetes.io/projected/b7dcc8f6-cabd-46ca-a229-47e740a59439-kube-api-access-jqpvh\") pod \"crc-debug-mf2md\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.022756 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7dcc8f6-cabd-46ca-a229-47e740a59439-host\") pod \"crc-debug-mf2md\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.022911 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7dcc8f6-cabd-46ca-a229-47e740a59439-host\") pod \"crc-debug-mf2md\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " 
pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.043885 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqpvh\" (UniqueName: \"kubernetes.io/projected/b7dcc8f6-cabd-46ca-a229-47e740a59439-kube-api-access-jqpvh\") pod \"crc-debug-mf2md\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.066587 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.073389 4618 scope.go:117] "RemoveContainer" containerID="a8b5e9711fbb7ae42aac4903b6aaa1d0a4dc0d96d28445d12e26c939028cecc2" Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.073543 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-wht4q" Jan 21 10:00:23 crc kubenswrapper[4618]: W0121 10:00:23.138671 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7dcc8f6_cabd_46ca_a229_47e740a59439.slice/crio-cc406d0111caa653cc9434789fa72efe245b72e2db7fbdefccbe252fb54d6db2 WatchSource:0}: Error finding container cc406d0111caa653cc9434789fa72efe245b72e2db7fbdefccbe252fb54d6db2: Status 404 returned error can't find the container with id cc406d0111caa653cc9434789fa72efe245b72e2db7fbdefccbe252fb54d6db2 Jan 21 10:00:23 crc kubenswrapper[4618]: I0121 10:00:23.550055 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9470a3-7ee5-41d1-8022-ba90ae4da33c" path="/var/lib/kubelet/pods/ef9470a3-7ee5-41d1-8022-ba90ae4da33c/volumes" Jan 21 10:00:24 crc kubenswrapper[4618]: I0121 10:00:24.085971 4618 generic.go:334] "Generic (PLEG): container finished" podID="b7dcc8f6-cabd-46ca-a229-47e740a59439" 
containerID="d11c0bccb56b93177166eac79409d153427d2b76afb3ca0f0108f599b821c754" exitCode=0 Jan 21 10:00:24 crc kubenswrapper[4618]: I0121 10:00:24.085995 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-mf2md" event={"ID":"b7dcc8f6-cabd-46ca-a229-47e740a59439","Type":"ContainerDied","Data":"d11c0bccb56b93177166eac79409d153427d2b76afb3ca0f0108f599b821c754"} Jan 21 10:00:24 crc kubenswrapper[4618]: I0121 10:00:24.086430 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/crc-debug-mf2md" event={"ID":"b7dcc8f6-cabd-46ca-a229-47e740a59439","Type":"ContainerStarted","Data":"cc406d0111caa653cc9434789fa72efe245b72e2db7fbdefccbe252fb54d6db2"} Jan 21 10:00:24 crc kubenswrapper[4618]: I0121 10:00:24.117443 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-mf2md"] Jan 21 10:00:24 crc kubenswrapper[4618]: I0121 10:00:24.123090 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pm2m4/crc-debug-mf2md"] Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.182886 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.262540 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqpvh\" (UniqueName: \"kubernetes.io/projected/b7dcc8f6-cabd-46ca-a229-47e740a59439-kube-api-access-jqpvh\") pod \"b7dcc8f6-cabd-46ca-a229-47e740a59439\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.262677 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7dcc8f6-cabd-46ca-a229-47e740a59439-host\") pod \"b7dcc8f6-cabd-46ca-a229-47e740a59439\" (UID: \"b7dcc8f6-cabd-46ca-a229-47e740a59439\") " Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.262803 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7dcc8f6-cabd-46ca-a229-47e740a59439-host" (OuterVolumeSpecName: "host") pod "b7dcc8f6-cabd-46ca-a229-47e740a59439" (UID: "b7dcc8f6-cabd-46ca-a229-47e740a59439"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.263236 4618 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7dcc8f6-cabd-46ca-a229-47e740a59439-host\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.269175 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7dcc8f6-cabd-46ca-a229-47e740a59439-kube-api-access-jqpvh" (OuterVolumeSpecName: "kube-api-access-jqpvh") pod "b7dcc8f6-cabd-46ca-a229-47e740a59439" (UID: "b7dcc8f6-cabd-46ca-a229-47e740a59439"). InnerVolumeSpecName "kube-api-access-jqpvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.366820 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqpvh\" (UniqueName: \"kubernetes.io/projected/b7dcc8f6-cabd-46ca-a229-47e740a59439-kube-api-access-jqpvh\") on node \"crc\" DevicePath \"\"" Jan 21 10:00:25 crc kubenswrapper[4618]: I0121 10:00:25.551011 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7dcc8f6-cabd-46ca-a229-47e740a59439" path="/var/lib/kubelet/pods/b7dcc8f6-cabd-46ca-a229-47e740a59439/volumes" Jan 21 10:00:26 crc kubenswrapper[4618]: I0121 10:00:26.107290 4618 scope.go:117] "RemoveContainer" containerID="d11c0bccb56b93177166eac79409d153427d2b76afb3ca0f0108f599b821c754" Jan 21 10:00:26 crc kubenswrapper[4618]: I0121 10:00:26.107323 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/crc-debug-mf2md" Jan 21 10:00:27 crc kubenswrapper[4618]: I0121 10:00:27.538463 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 10:00:28 crc kubenswrapper[4618]: I0121 10:00:28.138550 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"05cd93d30799b2bbbff31c63b9644ca2415625c1b9091d79a36612c5c0a8bf27"} Jan 21 10:00:28 crc kubenswrapper[4618]: I0121 10:00:28.655584 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vv9rr_d9674a2f-8cdc-4165-b8e0-9cfc0914d17f/cert-manager-controller/0.log" Jan 21 10:00:28 crc kubenswrapper[4618]: I0121 10:00:28.667451 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j6lvm_a23d36e0-6e5d-4cc6-a21c-9d6a114e7158/cert-manager-cainjector/0.log" Jan 21 10:00:28 crc 
kubenswrapper[4618]: I0121 10:00:28.674943 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-q6frw_d736c899-0a94-4fb8-9e97-077345f1a8b7/cert-manager-webhook/0.log" Jan 21 10:00:29 crc kubenswrapper[4618]: I0121 10:00:29.753787 4618 scope.go:117] "RemoveContainer" containerID="a0708cf98e07ec1dfbb2355dd82000acfef18763c3a1abdf9efa7a06d2161622" Jan 21 10:00:32 crc kubenswrapper[4618]: I0121 10:00:32.868928 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-sqxmv_d7fc037d-6b85-473a-bd03-3a266430e4e2/nmstate-console-plugin/0.log" Jan 21 10:00:32 crc kubenswrapper[4618]: I0121 10:00:32.890643 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fdzmd_a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2/nmstate-handler/0.log" Jan 21 10:00:32 crc kubenswrapper[4618]: I0121 10:00:32.901017 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/nmstate-metrics/0.log" Jan 21 10:00:32 crc kubenswrapper[4618]: I0121 10:00:32.910478 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/kube-rbac-proxy/0.log" Jan 21 10:00:32 crc kubenswrapper[4618]: I0121 10:00:32.925562 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-dcjhc_80022532-8c85-41c8-8c65-a67f28411a13/nmstate-operator/0.log" Jan 21 10:00:32 crc kubenswrapper[4618]: I0121 10:00:32.933560 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-lrckd_71e9ce01-3713-4cf6-a76e-ad21ac16e10e/nmstate-webhook/0.log" Jan 21 10:00:42 crc kubenswrapper[4618]: I0121 10:00:42.022045 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/controller/0.log" Jan 21 10:00:42 crc kubenswrapper[4618]: I0121 10:00:42.029214 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/kube-rbac-proxy/0.log" Jan 21 10:00:42 crc kubenswrapper[4618]: I0121 10:00:42.043822 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/controller/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.154529 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.165259 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/reloader/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.169873 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr-metrics/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.175067 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.181276 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy-frr/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.187558 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-frr-files/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.192912 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-reloader/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.199093 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-metrics/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.207088 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2l8f6_0b1f4460-bb9d-4f03-a4bd-57e0a5f79669/frr-k8s-webhook-server/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.228380 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-656ff8bd-4klk8_4b0325f8-aa62-451f-84b7-9f393225ff9d/manager/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.240704 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8485b999df-6fwkm_ecb8ccb1-678b-4dd5-be5e-8296b9305053/webhook-server/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.573789 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/speaker/0.log" Jan 21 10:00:43 crc kubenswrapper[4618]: I0121 10:00:43.585541 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/kube-rbac-proxy/0.log" Jan 21 10:00:46 crc kubenswrapper[4618]: I0121 10:00:46.789841 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82_56473d23-b169-4791-a419-71d0ddf89139/extract/0.log" Jan 21 10:00:46 crc kubenswrapper[4618]: I0121 10:00:46.800258 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82_56473d23-b169-4791-a419-71d0ddf89139/util/0.log" Jan 21 10:00:46 crc kubenswrapper[4618]: I0121 10:00:46.810001 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc5d82_56473d23-b169-4791-a419-71d0ddf89139/pull/0.log" Jan 21 10:00:46 crc kubenswrapper[4618]: I0121 10:00:46.819573 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr_ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd/extract/0.log" Jan 21 10:00:46 crc kubenswrapper[4618]: I0121 10:00:46.826907 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr_ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd/util/0.log" Jan 21 10:00:46 crc kubenswrapper[4618]: I0121 10:00:46.834075 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71378jwr_ed2112f2-a30b-4d37-b299-7fb4e5f0d4bd/pull/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.217335 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xcsv8_38d4879c-3ab9-4282-9d58-263cfb585759/registry-server/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.222375 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xcsv8_38d4879c-3ab9-4282-9d58-263cfb585759/extract-utilities/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.228955 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xcsv8_38d4879c-3ab9-4282-9d58-263cfb585759/extract-content/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.615937 4618 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9svvz_4e29e499-2283-4105-bcf5-73ae74791ce6/registry-server/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.620225 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9svvz_4e29e499-2283-4105-bcf5-73ae74791ce6/extract-utilities/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.625504 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9svvz_4e29e499-2283-4105-bcf5-73ae74791ce6/extract-content/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.636134 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4mpc9_9eb45d53-b317-4346-9a4e-679ff4473d3d/marketplace-operator/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.739791 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69lfl_4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5/registry-server/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.744051 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69lfl_4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5/extract-utilities/0.log" Jan 21 10:00:47 crc kubenswrapper[4618]: I0121 10:00:47.749399 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-69lfl_4e8fdca5-6e8d-4e22-81b7-8dd92526c0f5/extract-content/0.log" Jan 21 10:00:48 crc kubenswrapper[4618]: I0121 10:00:48.178326 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xdxtq_bc646d50-9435-404e-9b80-42ad016be4f9/registry-server/0.log" Jan 21 10:00:48 crc kubenswrapper[4618]: I0121 10:00:48.183752 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-xdxtq_bc646d50-9435-404e-9b80-42ad016be4f9/extract-utilities/0.log" Jan 21 10:00:48 crc kubenswrapper[4618]: I0121 10:00:48.192088 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xdxtq_bc646d50-9435-404e-9b80-42ad016be4f9/extract-content/0.log" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.137779 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29483161-kz7cb"] Jan 21 10:01:00 crc kubenswrapper[4618]: E0121 10:01:00.138833 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7dcc8f6-cabd-46ca-a229-47e740a59439" containerName="container-00" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.138849 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7dcc8f6-cabd-46ca-a229-47e740a59439" containerName="container-00" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.139050 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7dcc8f6-cabd-46ca-a229-47e740a59439" containerName="container-00" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.139729 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.150044 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483161-kz7cb"] Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.174787 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-combined-ca-bundle\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.174906 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-config-data\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.175007 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httrl\" (UniqueName: \"kubernetes.io/projected/d410d8eb-bf9f-458d-979b-f8fc96836332-kube-api-access-httrl\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.175097 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-fernet-keys\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.276778 4618 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-config-data\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.276857 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-httrl\" (UniqueName: \"kubernetes.io/projected/d410d8eb-bf9f-458d-979b-f8fc96836332-kube-api-access-httrl\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.276906 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-fernet-keys\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.276972 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-combined-ca-bundle\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.286707 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-combined-ca-bundle\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.286839 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-fernet-keys\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.286857 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-config-data\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.292861 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-httrl\" (UniqueName: \"kubernetes.io/projected/d410d8eb-bf9f-458d-979b-f8fc96836332-kube-api-access-httrl\") pod \"keystone-cron-29483161-kz7cb\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.456368 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:00 crc kubenswrapper[4618]: I0121 10:01:00.862099 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483161-kz7cb"] Jan 21 10:01:01 crc kubenswrapper[4618]: I0121 10:01:01.391119 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483161-kz7cb" event={"ID":"d410d8eb-bf9f-458d-979b-f8fc96836332","Type":"ContainerStarted","Data":"6946c449d6ff20da01e953ac794f9a37012f6dda9200e21b7e1e68d789e43d51"} Jan 21 10:01:01 crc kubenswrapper[4618]: I0121 10:01:01.391421 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483161-kz7cb" event={"ID":"d410d8eb-bf9f-458d-979b-f8fc96836332","Type":"ContainerStarted","Data":"9fb26b26a54d7add6c3a9fabf7a737c2a61d66ea7380fcf8fed6f271bc1dc672"} Jan 21 10:01:01 crc kubenswrapper[4618]: I0121 10:01:01.407028 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29483161-kz7cb" podStartSLOduration=1.407006268 podStartE2EDuration="1.407006268s" podCreationTimestamp="2026-01-21 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 10:01:01.402790793 +0000 UTC m=+3460.153258110" watchObservedRunningTime="2026-01-21 10:01:01.407006268 +0000 UTC m=+3460.157473585" Jan 21 10:01:03 crc kubenswrapper[4618]: I0121 10:01:03.406203 4618 generic.go:334] "Generic (PLEG): container finished" podID="d410d8eb-bf9f-458d-979b-f8fc96836332" containerID="6946c449d6ff20da01e953ac794f9a37012f6dda9200e21b7e1e68d789e43d51" exitCode=0 Jan 21 10:01:03 crc kubenswrapper[4618]: I0121 10:01:03.406266 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483161-kz7cb" 
event={"ID":"d410d8eb-bf9f-458d-979b-f8fc96836332","Type":"ContainerDied","Data":"6946c449d6ff20da01e953ac794f9a37012f6dda9200e21b7e1e68d789e43d51"} Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.675261 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.773642 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-fernet-keys\") pod \"d410d8eb-bf9f-458d-979b-f8fc96836332\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.773729 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-httrl\" (UniqueName: \"kubernetes.io/projected/d410d8eb-bf9f-458d-979b-f8fc96836332-kube-api-access-httrl\") pod \"d410d8eb-bf9f-458d-979b-f8fc96836332\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.773768 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-combined-ca-bundle\") pod \"d410d8eb-bf9f-458d-979b-f8fc96836332\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.773847 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-config-data\") pod \"d410d8eb-bf9f-458d-979b-f8fc96836332\" (UID: \"d410d8eb-bf9f-458d-979b-f8fc96836332\") " Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.783218 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "d410d8eb-bf9f-458d-979b-f8fc96836332" (UID: "d410d8eb-bf9f-458d-979b-f8fc96836332"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.784777 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d410d8eb-bf9f-458d-979b-f8fc96836332-kube-api-access-httrl" (OuterVolumeSpecName: "kube-api-access-httrl") pod "d410d8eb-bf9f-458d-979b-f8fc96836332" (UID: "d410d8eb-bf9f-458d-979b-f8fc96836332"). InnerVolumeSpecName "kube-api-access-httrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.806293 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d410d8eb-bf9f-458d-979b-f8fc96836332" (UID: "d410d8eb-bf9f-458d-979b-f8fc96836332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.822316 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-config-data" (OuterVolumeSpecName: "config-data") pod "d410d8eb-bf9f-458d-979b-f8fc96836332" (UID: "d410d8eb-bf9f-458d-979b-f8fc96836332"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.875909 4618 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.875937 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-httrl\" (UniqueName: \"kubernetes.io/projected/d410d8eb-bf9f-458d-979b-f8fc96836332-kube-api-access-httrl\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.875949 4618 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:04 crc kubenswrapper[4618]: I0121 10:01:04.875959 4618 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d410d8eb-bf9f-458d-979b-f8fc96836332-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:05 crc kubenswrapper[4618]: I0121 10:01:05.424005 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483161-kz7cb" event={"ID":"d410d8eb-bf9f-458d-979b-f8fc96836332","Type":"ContainerDied","Data":"9fb26b26a54d7add6c3a9fabf7a737c2a61d66ea7380fcf8fed6f271bc1dc672"} Jan 21 10:01:05 crc kubenswrapper[4618]: I0121 10:01:05.424214 4618 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fb26b26a54d7add6c3a9fabf7a737c2a61d66ea7380fcf8fed6f271bc1dc672" Jan 21 10:01:05 crc kubenswrapper[4618]: I0121 10:01:05.424068 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483161-kz7cb" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.606243 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8w85q"] Jan 21 10:01:12 crc kubenswrapper[4618]: E0121 10:01:12.607088 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d410d8eb-bf9f-458d-979b-f8fc96836332" containerName="keystone-cron" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.607102 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="d410d8eb-bf9f-458d-979b-f8fc96836332" containerName="keystone-cron" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.607296 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="d410d8eb-bf9f-458d-979b-f8fc96836332" containerName="keystone-cron" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.608630 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.617162 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w85q"] Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.737044 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvrz6\" (UniqueName: \"kubernetes.io/projected/5c94c0af-ab47-4949-9a41-b4d82f7061fc-kube-api-access-pvrz6\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.737608 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-catalog-content\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " 
pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.737652 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-utilities\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.839382 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvrz6\" (UniqueName: \"kubernetes.io/projected/5c94c0af-ab47-4949-9a41-b4d82f7061fc-kube-api-access-pvrz6\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.839462 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-catalog-content\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.839501 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-utilities\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.839887 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-catalog-content\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " 
pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.840014 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-utilities\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.866065 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvrz6\" (UniqueName: \"kubernetes.io/projected/5c94c0af-ab47-4949-9a41-b4d82f7061fc-kube-api-access-pvrz6\") pod \"redhat-marketplace-8w85q\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:12 crc kubenswrapper[4618]: I0121 10:01:12.924088 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:13 crc kubenswrapper[4618]: I0121 10:01:13.380907 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w85q"] Jan 21 10:01:13 crc kubenswrapper[4618]: I0121 10:01:13.484591 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerStarted","Data":"54342072fdebf3cd2866c57b46bb6a83c9468691f8b7992373c51657ed8c834b"} Jan 21 10:01:14 crc kubenswrapper[4618]: I0121 10:01:14.493082 4618 generic.go:334] "Generic (PLEG): container finished" podID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerID="dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff" exitCode=0 Jan 21 10:01:14 crc kubenswrapper[4618]: I0121 10:01:14.493184 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" 
event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerDied","Data":"dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff"} Jan 21 10:01:15 crc kubenswrapper[4618]: I0121 10:01:15.503135 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerStarted","Data":"680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c"} Jan 21 10:01:16 crc kubenswrapper[4618]: I0121 10:01:16.512065 4618 generic.go:334] "Generic (PLEG): container finished" podID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerID="680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c" exitCode=0 Jan 21 10:01:16 crc kubenswrapper[4618]: I0121 10:01:16.512186 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerDied","Data":"680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c"} Jan 21 10:01:17 crc kubenswrapper[4618]: I0121 10:01:17.521543 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerStarted","Data":"896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf"} Jan 21 10:01:17 crc kubenswrapper[4618]: I0121 10:01:17.539027 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8w85q" podStartSLOduration=3.046726352 podStartE2EDuration="5.539011425s" podCreationTimestamp="2026-01-21 10:01:12 +0000 UTC" firstStartedPulling="2026-01-21 10:01:14.494942858 +0000 UTC m=+3473.245410175" lastFinishedPulling="2026-01-21 10:01:16.98722793 +0000 UTC m=+3475.737695248" observedRunningTime="2026-01-21 10:01:17.535661698 +0000 UTC m=+3476.286129015" watchObservedRunningTime="2026-01-21 10:01:17.539011425 +0000 UTC 
m=+3476.289478742" Jan 21 10:01:22 crc kubenswrapper[4618]: I0121 10:01:22.925331 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:22 crc kubenswrapper[4618]: I0121 10:01:22.925847 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:22 crc kubenswrapper[4618]: I0121 10:01:22.970537 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:23 crc kubenswrapper[4618]: I0121 10:01:23.617969 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:23 crc kubenswrapper[4618]: I0121 10:01:23.667315 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w85q"] Jan 21 10:01:25 crc kubenswrapper[4618]: I0121 10:01:25.595173 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8w85q" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="registry-server" containerID="cri-o://896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf" gracePeriod=2 Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.024700 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.208222 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-utilities\") pod \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.208453 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-catalog-content\") pod \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.208693 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvrz6\" (UniqueName: \"kubernetes.io/projected/5c94c0af-ab47-4949-9a41-b4d82f7061fc-kube-api-access-pvrz6\") pod \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\" (UID: \"5c94c0af-ab47-4949-9a41-b4d82f7061fc\") " Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.209051 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-utilities" (OuterVolumeSpecName: "utilities") pod "5c94c0af-ab47-4949-9a41-b4d82f7061fc" (UID: "5c94c0af-ab47-4949-9a41-b4d82f7061fc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.210233 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.221308 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c94c0af-ab47-4949-9a41-b4d82f7061fc-kube-api-access-pvrz6" (OuterVolumeSpecName: "kube-api-access-pvrz6") pod "5c94c0af-ab47-4949-9a41-b4d82f7061fc" (UID: "5c94c0af-ab47-4949-9a41-b4d82f7061fc"). InnerVolumeSpecName "kube-api-access-pvrz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.234956 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c94c0af-ab47-4949-9a41-b4d82f7061fc" (UID: "5c94c0af-ab47-4949-9a41-b4d82f7061fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.312053 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c94c0af-ab47-4949-9a41-b4d82f7061fc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.312079 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvrz6\" (UniqueName: \"kubernetes.io/projected/5c94c0af-ab47-4949-9a41-b4d82f7061fc-kube-api-access-pvrz6\") on node \"crc\" DevicePath \"\"" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.605642 4618 generic.go:334] "Generic (PLEG): container finished" podID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerID="896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf" exitCode=0 Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.605694 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerDied","Data":"896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf"} Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.605707 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w85q" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.605729 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w85q" event={"ID":"5c94c0af-ab47-4949-9a41-b4d82f7061fc","Type":"ContainerDied","Data":"54342072fdebf3cd2866c57b46bb6a83c9468691f8b7992373c51657ed8c834b"} Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.605758 4618 scope.go:117] "RemoveContainer" containerID="896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.628302 4618 scope.go:117] "RemoveContainer" containerID="680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.639452 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w85q"] Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.650272 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w85q"] Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.660785 4618 scope.go:117] "RemoveContainer" containerID="dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.685780 4618 scope.go:117] "RemoveContainer" containerID="896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf" Jan 21 10:01:26 crc kubenswrapper[4618]: E0121 10:01:26.686239 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf\": container with ID starting with 896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf not found: ID does not exist" containerID="896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.686281 4618 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf"} err="failed to get container status \"896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf\": rpc error: code = NotFound desc = could not find container \"896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf\": container with ID starting with 896f4366857db4c0401e1940a09b2b76794cfe525b33b65bfb990cd69d2fd8cf not found: ID does not exist" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.686309 4618 scope.go:117] "RemoveContainer" containerID="680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c" Jan 21 10:01:26 crc kubenswrapper[4618]: E0121 10:01:26.686603 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c\": container with ID starting with 680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c not found: ID does not exist" containerID="680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.686627 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c"} err="failed to get container status \"680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c\": rpc error: code = NotFound desc = could not find container \"680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c\": container with ID starting with 680a02e4ac346f79e130aed05b6e8e1a649212310b5d3f6e051b6d6436a7cc4c not found: ID does not exist" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.686639 4618 scope.go:117] "RemoveContainer" containerID="dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff" Jan 21 10:01:26 crc kubenswrapper[4618]: E0121 
10:01:26.686915 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff\": container with ID starting with dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff not found: ID does not exist" containerID="dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff" Jan 21 10:01:26 crc kubenswrapper[4618]: I0121 10:01:26.687018 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff"} err="failed to get container status \"dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff\": rpc error: code = NotFound desc = could not find container \"dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff\": container with ID starting with dfa0c0d66f530bb2db202113676ea08d1a6e2b9c9cc8abf7651b15f97c0491ff not found: ID does not exist" Jan 21 10:01:27 crc kubenswrapper[4618]: I0121 10:01:27.545970 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" path="/var/lib/kubelet/pods/5c94c0af-ab47-4949-9a41-b4d82f7061fc/volumes" Jan 21 10:01:43 crc kubenswrapper[4618]: I0121 10:01:43.455509 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/controller/0.log" Jan 21 10:01:43 crc kubenswrapper[4618]: I0121 10:01:43.460915 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gn7q5_3754650d-5a51-4b01-98e7-2575b5212346/kube-rbac-proxy/0.log" Jan 21 10:01:43 crc kubenswrapper[4618]: I0121 10:01:43.487379 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/controller/0.log" Jan 21 10:01:43 crc kubenswrapper[4618]: I0121 10:01:43.510009 4618 
log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vv9rr_d9674a2f-8cdc-4165-b8e0-9cfc0914d17f/cert-manager-controller/0.log" Jan 21 10:01:43 crc kubenswrapper[4618]: I0121 10:01:43.525292 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j6lvm_a23d36e0-6e5d-4cc6-a21c-9d6a114e7158/cert-manager-cainjector/0.log" Jan 21 10:01:43 crc kubenswrapper[4618]: I0121 10:01:43.537365 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-q6frw_d736c899-0a94-4fb8-9e97-077345f1a8b7/cert-manager-webhook/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.330033 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/extract/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.337562 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/util/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.343630 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/pull/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.408687 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-6j9f2_982d4204-447a-43c3-858e-c16cceebf1bb/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.455653 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6zn64_d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 
10:01:44.474924 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-nf54z_f3975776-d0c3-478c-873c-349415bf2d3c/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.618982 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4r2qm_e0011800-e28a-4e71-8306-819d8d865dfe/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.637840 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-ms7zc_276f144f-a185-46da-a3af-f0aa8a9eaaad/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.669206 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-bd65l_0ff11d9c-92c7-4b78-8336-70e117f63880/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.797833 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.808240 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/reloader/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.812522 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/frr-metrics/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.822876 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.829890 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/kube-rbac-proxy-frr/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.833522 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-frr-files/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.839783 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-reloader/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.846468 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fljjn_305963d0-7d19-440d-ba24-c836947123ab/cp-metrics/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.856821 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2l8f6_0b1f4460-bb9d-4f03-a4bd-57e0a5f79669/frr-k8s-webhook-server/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.877758 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-656ff8bd-4klk8_4b0325f8-aa62-451f-84b7-9f393225ff9d/manager/0.log" Jan 21 10:01:44 crc kubenswrapper[4618]: I0121 10:01:44.885540 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8485b999df-6fwkm_ecb8ccb1-678b-4dd5-be5e-8296b9305053/webhook-server/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.027921 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-dsjzx_cad4873a-5a2e-40ea-a4b1-3173e8138be0/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.039792 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-g58rl_80cee31f-467d-4c99-8b58-1edbee74f4a9/manager/0.log" Jan 
21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.114744 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-l55q5_69396ad4-b4ad-4f43-a0f5-83b655e590da/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.127306 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-djc75_61c3771f-ea2c-4307-8d5b-7f44194235cd/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.169267 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-lsgpp_0ec13d1d-fae7-4efd-92d6-0b93f972694f/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.232685 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-5m9wn_f0bde946-f6c9-45a5-a124-6cf62551f0bc/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.326407 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/speaker/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.334366 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxvc2_5acf067e-b50e-4176-8d97-18188382659a/kube-rbac-proxy/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.337975 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-j5xjz_14908c8c-b444-4359-9e3a-e0fcc443e9f7/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.355748 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-cmhx4_1739988f-1de9-4c68-85ac-c14971105314/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.370462 4618 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b9875986dvc9c5_b662a5ae-39f6-4592-baf2-efa15f7c82b0/manager/0.log" Jan 21 10:01:45 crc kubenswrapper[4618]: I0121 10:01:45.556819 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6d4d7d8545-hbl4s_049e7414-823b-45cc-92e6-da0652157046/operator/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.166219 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vv9rr_d9674a2f-8cdc-4165-b8e0-9cfc0914d17f/cert-manager-controller/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.182869 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j6lvm_a23d36e0-6e5d-4cc6-a21c-9d6a114e7158/cert-manager-cainjector/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.190956 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-q6frw_d736c899-0a94-4fb8-9e97-077345f1a8b7/cert-manager-webhook/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.682284 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75bfd788c8-42lr9_cfa3b66e-c251-46f7-ade1-edd4df56db67/manager/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.740190 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6m77l_fa1a4914-7994-4004-b3aa-b3bbf62ed6df/registry-server/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.791004 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-7nkmc_b3629416-c45e-46da-98ba-dfd8b6630abd/manager/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.797125 4618 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9t8g5_c73d2ba9-7e84-4b30-a2d2-66da4cdcfd3f/control-plane-machine-set-operator/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.812672 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-r895x_1f7120e5-8e39-4664-9d63-beaea1ff4043/manager/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.813529 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/kube-rbac-proxy/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.820633 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kt5l4_b52b45bc-5ace-4daa-8548-030f576ece0f/machine-api-operator/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.831910 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9nmj5_1bab5bac-6dfb-48f0-bf21-71dbfb2d3653/operator/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.850864 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-zgrxl_5af2019b-e469-403f-8c3e-91006f2902ad/manager/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.901583 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-778qv_16d3b481-106a-48ee-b99c-7a380086a9cd/manager/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: I0121 10:01:46.910243 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-g4khd_e4f5bddf-5e04-4510-903b-6861f19fa87b/manager/0.log" Jan 21 10:01:46 crc kubenswrapper[4618]: 
I0121 10:01:46.921368 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-czzg6_010792a0-26fd-456a-9186-79799c9a511e/manager/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.436074 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/extract/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.443510 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/util/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.451053 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2e2nhww_1d0a2799-66d1-4406-b3da-33db634ae051/pull/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.552364 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-6j9f2_982d4204-447a-43c3-858e-c16cceebf1bb/manager/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.589249 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6zn64_d1aecea0-7bc5-48c6-8edc-c7d447f7b7f4/manager/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.601376 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-nf54z_f3975776-d0c3-478c-873c-349415bf2d3c/manager/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.702357 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4r2qm_e0011800-e28a-4e71-8306-819d8d865dfe/manager/0.log" 
Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.712900 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-ms7zc_276f144f-a185-46da-a3af-f0aa8a9eaaad/manager/0.log" Jan 21 10:01:47 crc kubenswrapper[4618]: I0121 10:01:47.741173 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-bd65l_0ff11d9c-92c7-4b78-8336-70e117f63880/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.036633 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-dsjzx_cad4873a-5a2e-40ea-a4b1-3173e8138be0/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.049714 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-g58rl_80cee31f-467d-4c99-8b58-1edbee74f4a9/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.113260 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-l55q5_69396ad4-b4ad-4f43-a0f5-83b655e590da/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.123513 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-djc75_61c3771f-ea2c-4307-8d5b-7f44194235cd/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.146622 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-sqxmv_d7fc037d-6b85-473a-bd03-3a266430e4e2/nmstate-console-plugin/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.154356 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-lsgpp_0ec13d1d-fae7-4efd-92d6-0b93f972694f/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.158168 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fdzmd_a5eab1fc-fb36-4437-bf98-e0a3fcea7fc2/nmstate-handler/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.167343 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/nmstate-metrics/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.174913 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-8r4qk_822b5ec2-ecb3-459a-8445-6722cc28e866/kube-rbac-proxy/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.188995 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-dcjhc_80022532-8c85-41c8-8c65-a67f28411a13/nmstate-operator/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.196645 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-lrckd_71e9ce01-3713-4cf6-a76e-ad21ac16e10e/nmstate-webhook/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.216635 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-5m9wn_f0bde946-f6c9-45a5-a124-6cf62551f0bc/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.303198 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-j5xjz_14908c8c-b444-4359-9e3a-e0fcc443e9f7/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.315028 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-cmhx4_1739988f-1de9-4c68-85ac-c14971105314/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.331073 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b9875986dvc9c5_b662a5ae-39f6-4592-baf2-efa15f7c82b0/manager/0.log" Jan 21 10:01:48 crc kubenswrapper[4618]: I0121 10:01:48.494378 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6d4d7d8545-hbl4s_049e7414-823b-45cc-92e6-da0652157046/operator/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.742308 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-75bfd788c8-42lr9_cfa3b66e-c251-46f7-ade1-edd4df56db67/manager/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.774366 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6m77l_fa1a4914-7994-4004-b3aa-b3bbf62ed6df/registry-server/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.838489 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-7nkmc_b3629416-c45e-46da-98ba-dfd8b6630abd/manager/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.859330 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-r895x_1f7120e5-8e39-4664-9d63-beaea1ff4043/manager/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.878699 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-9nmj5_1bab5bac-6dfb-48f0-bf21-71dbfb2d3653/operator/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.908254 4618 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-zgrxl_5af2019b-e469-403f-8c3e-91006f2902ad/manager/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.969513 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-778qv_16d3b481-106a-48ee-b99c-7a380086a9cd/manager/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.979877 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-g4khd_e4f5bddf-5e04-4510-903b-6861f19fa87b/manager/0.log" Jan 21 10:01:49 crc kubenswrapper[4618]: I0121 10:01:49.989770 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-czzg6_010792a0-26fd-456a-9186-79799c9a511e/manager/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.365092 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/kube-multus-additional-cni-plugins/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.372301 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/egress-router-binary-copy/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.382377 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/cni-plugins/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.388848 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/bond-cni-plugin/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.394473 4618 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/routeoverride-cni/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.400834 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/whereabouts-cni-bincopy/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.407035 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-24dd7_32082919-a07c-414d-b784-1ad042460385/whereabouts-cni/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.437881 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-dp9f8_fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b/multus-admission-controller/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.442889 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-dp9f8_fbd15d3f-98ac-438c-90d3-2a07fd6ffa1b/kube-rbac-proxy/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.491873 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/2.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.580482 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m6jz5_052a66c4-94ce-4336-93f6-1d0023e58cc4/kube-multus/3.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.613921 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpxzc_d164c95c-cb58-47e7-a3a3-7e7bce8b9743/network-metrics-daemon/0.log" Jan 21 10:01:51 crc kubenswrapper[4618]: I0121 10:01:51.619908 4618 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-kpxzc_d164c95c-cb58-47e7-a3a3-7e7bce8b9743/kube-rbac-proxy/0.log" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.124613 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rswz2"] Jan 21 10:02:32 crc kubenswrapper[4618]: E0121 10:02:32.126979 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="extract-content" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.127011 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="extract-content" Jan 21 10:02:32 crc kubenswrapper[4618]: E0121 10:02:32.127023 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="extract-utilities" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.127029 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="extract-utilities" Jan 21 10:02:32 crc kubenswrapper[4618]: E0121 10:02:32.127054 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="registry-server" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.127062 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="registry-server" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.127578 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c94c0af-ab47-4949-9a41-b4d82f7061fc" containerName="registry-server" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.129799 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.158041 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rswz2"] Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.210876 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-catalog-content\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.211033 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-utilities\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.211345 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h5wj\" (UniqueName: \"kubernetes.io/projected/b45b7bd3-7686-4815-85e4-a0455c532d81-kube-api-access-9h5wj\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.313409 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-utilities\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.313508 4618 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9h5wj\" (UniqueName: \"kubernetes.io/projected/b45b7bd3-7686-4815-85e4-a0455c532d81-kube-api-access-9h5wj\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.313595 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-catalog-content\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.313865 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-utilities\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.314007 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-catalog-content\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.342123 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h5wj\" (UniqueName: \"kubernetes.io/projected/b45b7bd3-7686-4815-85e4-a0455c532d81-kube-api-access-9h5wj\") pod \"community-operators-rswz2\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.452154 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:32 crc kubenswrapper[4618]: I0121 10:02:32.926354 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rswz2"] Jan 21 10:02:33 crc kubenswrapper[4618]: I0121 10:02:33.184667 4618 generic.go:334] "Generic (PLEG): container finished" podID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerID="bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5" exitCode=0 Jan 21 10:02:33 crc kubenswrapper[4618]: I0121 10:02:33.184725 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rswz2" event={"ID":"b45b7bd3-7686-4815-85e4-a0455c532d81","Type":"ContainerDied","Data":"bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5"} Jan 21 10:02:33 crc kubenswrapper[4618]: I0121 10:02:33.184771 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rswz2" event={"ID":"b45b7bd3-7686-4815-85e4-a0455c532d81","Type":"ContainerStarted","Data":"b3aa67a33e0b7efdc4a0ac470769dacc6fb0589eede42122bf050c6e13ca885b"} Jan 21 10:02:34 crc kubenswrapper[4618]: I0121 10:02:34.193724 4618 generic.go:334] "Generic (PLEG): container finished" podID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerID="58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256" exitCode=0 Jan 21 10:02:34 crc kubenswrapper[4618]: I0121 10:02:34.193911 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rswz2" event={"ID":"b45b7bd3-7686-4815-85e4-a0455c532d81","Type":"ContainerDied","Data":"58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256"} Jan 21 10:02:34 crc kubenswrapper[4618]: I0121 10:02:34.196162 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 10:02:35 crc kubenswrapper[4618]: I0121 10:02:35.230606 4618 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-rswz2" event={"ID":"b45b7bd3-7686-4815-85e4-a0455c532d81","Type":"ContainerStarted","Data":"546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8"} Jan 21 10:02:35 crc kubenswrapper[4618]: I0121 10:02:35.258608 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rswz2" podStartSLOduration=1.757435418 podStartE2EDuration="3.258590505s" podCreationTimestamp="2026-01-21 10:02:32 +0000 UTC" firstStartedPulling="2026-01-21 10:02:33.18671363 +0000 UTC m=+3551.937180947" lastFinishedPulling="2026-01-21 10:02:34.687868717 +0000 UTC m=+3553.438336034" observedRunningTime="2026-01-21 10:02:35.251072141 +0000 UTC m=+3554.001539458" watchObservedRunningTime="2026-01-21 10:02:35.258590505 +0000 UTC m=+3554.009057822" Jan 21 10:02:42 crc kubenswrapper[4618]: I0121 10:02:42.452874 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:42 crc kubenswrapper[4618]: I0121 10:02:42.453539 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:42 crc kubenswrapper[4618]: I0121 10:02:42.489632 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:43 crc kubenswrapper[4618]: I0121 10:02:43.349440 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:43 crc kubenswrapper[4618]: I0121 10:02:43.384278 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rswz2"] Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.332424 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rswz2" 
podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="registry-server" containerID="cri-o://546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8" gracePeriod=2 Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.724500 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.774262 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-catalog-content\") pod \"b45b7bd3-7686-4815-85e4-a0455c532d81\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.774319 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-utilities\") pod \"b45b7bd3-7686-4815-85e4-a0455c532d81\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.774472 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h5wj\" (UniqueName: \"kubernetes.io/projected/b45b7bd3-7686-4815-85e4-a0455c532d81-kube-api-access-9h5wj\") pod \"b45b7bd3-7686-4815-85e4-a0455c532d81\" (UID: \"b45b7bd3-7686-4815-85e4-a0455c532d81\") " Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.775221 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-utilities" (OuterVolumeSpecName: "utilities") pod "b45b7bd3-7686-4815-85e4-a0455c532d81" (UID: "b45b7bd3-7686-4815-85e4-a0455c532d81"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.781432 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45b7bd3-7686-4815-85e4-a0455c532d81-kube-api-access-9h5wj" (OuterVolumeSpecName: "kube-api-access-9h5wj") pod "b45b7bd3-7686-4815-85e4-a0455c532d81" (UID: "b45b7bd3-7686-4815-85e4-a0455c532d81"). InnerVolumeSpecName "kube-api-access-9h5wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.820388 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b45b7bd3-7686-4815-85e4-a0455c532d81" (UID: "b45b7bd3-7686-4815-85e4-a0455c532d81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.876357 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h5wj\" (UniqueName: \"kubernetes.io/projected/b45b7bd3-7686-4815-85e4-a0455c532d81-kube-api-access-9h5wj\") on node \"crc\" DevicePath \"\"" Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.876405 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 10:02:45 crc kubenswrapper[4618]: I0121 10:02:45.876416 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45b7bd3-7686-4815-85e4-a0455c532d81-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.343819 4618 generic.go:334] "Generic (PLEG): container finished" podID="b45b7bd3-7686-4815-85e4-a0455c532d81" 
containerID="546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8" exitCode=0 Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.343873 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rswz2" event={"ID":"b45b7bd3-7686-4815-85e4-a0455c532d81","Type":"ContainerDied","Data":"546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8"} Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.343915 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rswz2" event={"ID":"b45b7bd3-7686-4815-85e4-a0455c532d81","Type":"ContainerDied","Data":"b3aa67a33e0b7efdc4a0ac470769dacc6fb0589eede42122bf050c6e13ca885b"} Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.343922 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rswz2" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.343940 4618 scope.go:117] "RemoveContainer" containerID="546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.360787 4618 scope.go:117] "RemoveContainer" containerID="58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.378745 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rswz2"] Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.386229 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rswz2"] Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.405171 4618 scope.go:117] "RemoveContainer" containerID="bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.423431 4618 scope.go:117] "RemoveContainer" containerID="546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8" Jan 21 
10:02:46 crc kubenswrapper[4618]: E0121 10:02:46.423879 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8\": container with ID starting with 546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8 not found: ID does not exist" containerID="546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.423911 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8"} err="failed to get container status \"546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8\": rpc error: code = NotFound desc = could not find container \"546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8\": container with ID starting with 546edc0f280f8c329b6caca9ce97b8240d60fdb3a79429b2b490400ed52004c8 not found: ID does not exist" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.423938 4618 scope.go:117] "RemoveContainer" containerID="58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256" Jan 21 10:02:46 crc kubenswrapper[4618]: E0121 10:02:46.424242 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256\": container with ID starting with 58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256 not found: ID does not exist" containerID="58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.424280 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256"} err="failed to get container status 
\"58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256\": rpc error: code = NotFound desc = could not find container \"58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256\": container with ID starting with 58e4af40f9f815d80c7c1a11558262ea67708147277ed49d9efe4f8eddf10256 not found: ID does not exist" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.424298 4618 scope.go:117] "RemoveContainer" containerID="bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5" Jan 21 10:02:46 crc kubenswrapper[4618]: E0121 10:02:46.424561 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5\": container with ID starting with bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5 not found: ID does not exist" containerID="bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5" Jan 21 10:02:46 crc kubenswrapper[4618]: I0121 10:02:46.424587 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5"} err="failed to get container status \"bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5\": rpc error: code = NotFound desc = could not find container \"bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5\": container with ID starting with bc22fea433bad6af4270310dd609f3b9dd1df45ae83ae7f8b2919ad19948c7e5 not found: ID does not exist" Jan 21 10:02:47 crc kubenswrapper[4618]: I0121 10:02:47.550287 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" path="/var/lib/kubelet/pods/b45b7bd3-7686-4815-85e4-a0455c532d81/volumes" Jan 21 10:02:56 crc kubenswrapper[4618]: I0121 10:02:56.959114 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 10:02:56 crc kubenswrapper[4618]: I0121 10:02:56.959681 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 10:03:26 crc kubenswrapper[4618]: I0121 10:03:26.958877 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 10:03:26 crc kubenswrapper[4618]: I0121 10:03:26.959628 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 10:03:56 crc kubenswrapper[4618]: I0121 10:03:56.959544 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 10:03:56 crc kubenswrapper[4618]: I0121 10:03:56.960359 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 10:03:56 crc kubenswrapper[4618]: I0121 10:03:56.960419 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 10:03:56 crc kubenswrapper[4618]: I0121 10:03:56.962073 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05cd93d30799b2bbbff31c63b9644ca2415625c1b9091d79a36612c5c0a8bf27"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 10:03:56 crc kubenswrapper[4618]: I0121 10:03:56.962220 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://05cd93d30799b2bbbff31c63b9644ca2415625c1b9091d79a36612c5c0a8bf27" gracePeriod=600 Jan 21 10:03:57 crc kubenswrapper[4618]: I0121 10:03:57.943486 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="05cd93d30799b2bbbff31c63b9644ca2415625c1b9091d79a36612c5c0a8bf27" exitCode=0 Jan 21 10:03:57 crc kubenswrapper[4618]: I0121 10:03:57.943987 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"05cd93d30799b2bbbff31c63b9644ca2415625c1b9091d79a36612c5c0a8bf27"} Jan 21 10:03:57 crc kubenswrapper[4618]: I0121 10:03:57.944018 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" 
event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerStarted","Data":"a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"} Jan 21 10:03:57 crc kubenswrapper[4618]: I0121 10:03:57.944053 4618 scope.go:117] "RemoveContainer" containerID="271b816461aa310b392a79b3116355263c087c94ad76db772f5fda54aae24251" Jan 21 10:06:26 crc kubenswrapper[4618]: I0121 10:06:26.959640 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 10:06:26 crc kubenswrapper[4618]: I0121 10:06:26.960236 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 10:06:56 crc kubenswrapper[4618]: I0121 10:06:56.958840 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 10:06:56 crc kubenswrapper[4618]: I0121 10:06:56.959560 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 10:07:26 crc kubenswrapper[4618]: I0121 10:07:26.959494 4618 patch_prober.go:28] interesting pod/machine-config-daemon-2bm47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 10:07:26 crc kubenswrapper[4618]: I0121 10:07:26.960325 4618 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 10:07:26 crc kubenswrapper[4618]: I0121 10:07:26.960395 4618 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" Jan 21 10:07:26 crc kubenswrapper[4618]: I0121 10:07:26.961412 4618 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"} pod="openshift-machine-config-operator/machine-config-daemon-2bm47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 10:07:26 crc kubenswrapper[4618]: I0121 10:07:26.961483 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerName="machine-config-daemon" containerID="cri-o://a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" gracePeriod=600 Jan 21 10:07:27 crc kubenswrapper[4618]: E0121 10:07:27.086418 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:07:27 crc kubenswrapper[4618]: I0121 10:07:27.858476 4618 generic.go:334] "Generic (PLEG): container finished" podID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" exitCode=0 Jan 21 10:07:27 crc kubenswrapper[4618]: I0121 10:07:27.858526 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" event={"ID":"f819fb41-8eb7-4f8f-85f9-752aa5716cca","Type":"ContainerDied","Data":"a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"} Jan 21 10:07:27 crc kubenswrapper[4618]: I0121 10:07:27.858612 4618 scope.go:117] "RemoveContainer" containerID="05cd93d30799b2bbbff31c63b9644ca2415625c1b9091d79a36612c5c0a8bf27" Jan 21 10:07:27 crc kubenswrapper[4618]: I0121 10:07:27.859240 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:07:27 crc kubenswrapper[4618]: E0121 10:07:27.859720 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:07:41 crc kubenswrapper[4618]: I0121 10:07:41.544260 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:07:41 crc kubenswrapper[4618]: E0121 10:07:41.545466 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:07:54 crc kubenswrapper[4618]: I0121 10:07:54.538369 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:07:54 crc kubenswrapper[4618]: E0121 10:07:54.539461 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:08:05 crc kubenswrapper[4618]: I0121 10:08:05.538300 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:08:05 crc kubenswrapper[4618]: E0121 10:08:05.539554 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:08:20 crc kubenswrapper[4618]: I0121 10:08:20.539100 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:08:20 crc kubenswrapper[4618]: E0121 10:08:20.540821 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:08:33 crc kubenswrapper[4618]: I0121 10:08:33.538265 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:08:33 crc kubenswrapper[4618]: E0121 10:08:33.539230 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:08:46 crc kubenswrapper[4618]: I0121 10:08:46.538780 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:08:46 crc kubenswrapper[4618]: E0121 10:08:46.539591 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:08:58 crc kubenswrapper[4618]: I0121 10:08:58.537906 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:08:58 crc kubenswrapper[4618]: E0121 10:08:58.539054 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:08:58 crc kubenswrapper[4618]: I0121 10:08:58.692914 4618 generic.go:334] "Generic (PLEG): container finished" podID="8f533448-f67c-46a4-bda8-f432fd43e484" containerID="0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3" exitCode=0 Jan 21 10:08:58 crc kubenswrapper[4618]: I0121 10:08:58.693014 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" event={"ID":"8f533448-f67c-46a4-bda8-f432fd43e484","Type":"ContainerDied","Data":"0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3"} Jan 21 10:08:58 crc kubenswrapper[4618]: I0121 10:08:58.693933 4618 scope.go:117] "RemoveContainer" containerID="0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3" Jan 21 10:08:59 crc kubenswrapper[4618]: I0121 10:08:59.615439 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pm2m4_must-gather-mv5cc_8f533448-f67c-46a4-bda8-f432fd43e484/gather/0.log" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.172039 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pm2m4/must-gather-mv5cc"] Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.172893 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" containerName="copy" containerID="cri-o://e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805" gracePeriod=2 Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.185752 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pm2m4/must-gather-mv5cc"] Jan 21 10:09:07 crc 
kubenswrapper[4618]: I0121 10:09:07.722270 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pm2m4_must-gather-mv5cc_8f533448-f67c-46a4-bda8-f432fd43e484/copy/0.log" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.723019 4618 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.773422 4618 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pm2m4_must-gather-mv5cc_8f533448-f67c-46a4-bda8-f432fd43e484/copy/0.log" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.773773 4618 generic.go:334] "Generic (PLEG): container finished" podID="8f533448-f67c-46a4-bda8-f432fd43e484" containerID="e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805" exitCode=143 Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.773833 4618 scope.go:117] "RemoveContainer" containerID="e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.773847 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pm2m4/must-gather-mv5cc" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.790351 4618 scope.go:117] "RemoveContainer" containerID="0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.852702 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrgd9\" (UniqueName: \"kubernetes.io/projected/8f533448-f67c-46a4-bda8-f432fd43e484-kube-api-access-jrgd9\") pod \"8f533448-f67c-46a4-bda8-f432fd43e484\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.852880 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f533448-f67c-46a4-bda8-f432fd43e484-must-gather-output\") pod \"8f533448-f67c-46a4-bda8-f432fd43e484\" (UID: \"8f533448-f67c-46a4-bda8-f432fd43e484\") " Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.864456 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f533448-f67c-46a4-bda8-f432fd43e484-kube-api-access-jrgd9" (OuterVolumeSpecName: "kube-api-access-jrgd9") pod "8f533448-f67c-46a4-bda8-f432fd43e484" (UID: "8f533448-f67c-46a4-bda8-f432fd43e484"). InnerVolumeSpecName "kube-api-access-jrgd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.872968 4618 scope.go:117] "RemoveContainer" containerID="e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805" Jan 21 10:09:07 crc kubenswrapper[4618]: E0121 10:09:07.873426 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805\": container with ID starting with e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805 not found: ID does not exist" containerID="e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.873472 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805"} err="failed to get container status \"e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805\": rpc error: code = NotFound desc = could not find container \"e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805\": container with ID starting with e478a44c203ac18b92e18f5ee960d853dd894166574de9e0d552bc4c759b7805 not found: ID does not exist" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.873507 4618 scope.go:117] "RemoveContainer" containerID="0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3" Jan 21 10:09:07 crc kubenswrapper[4618]: E0121 10:09:07.874954 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3\": container with ID starting with 0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3 not found: ID does not exist" containerID="0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.874992 
4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3"} err="failed to get container status \"0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3\": rpc error: code = NotFound desc = could not find container \"0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3\": container with ID starting with 0b69fbf4615cf2afe9fc0b68da1d62788ed0b2935e2c4b430c96aee95d14e0b3 not found: ID does not exist" Jan 21 10:09:07 crc kubenswrapper[4618]: I0121 10:09:07.956740 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrgd9\" (UniqueName: \"kubernetes.io/projected/8f533448-f67c-46a4-bda8-f432fd43e484-kube-api-access-jrgd9\") on node \"crc\" DevicePath \"\"" Jan 21 10:09:08 crc kubenswrapper[4618]: I0121 10:09:08.043335 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f533448-f67c-46a4-bda8-f432fd43e484-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8f533448-f67c-46a4-bda8-f432fd43e484" (UID: "8f533448-f67c-46a4-bda8-f432fd43e484"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:09:08 crc kubenswrapper[4618]: I0121 10:09:08.059477 4618 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f533448-f67c-46a4-bda8-f432fd43e484-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 21 10:09:09 crc kubenswrapper[4618]: I0121 10:09:09.551122 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" path="/var/lib/kubelet/pods/8f533448-f67c-46a4-bda8-f432fd43e484/volumes" Jan 21 10:09:12 crc kubenswrapper[4618]: I0121 10:09:12.538800 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:09:12 crc kubenswrapper[4618]: E0121 10:09:12.539516 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:09:26 crc kubenswrapper[4618]: I0121 10:09:26.550007 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:09:26 crc kubenswrapper[4618]: E0121 10:09:26.551525 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:09:41 crc kubenswrapper[4618]: I0121 10:09:41.543269 4618 
scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:09:41 crc kubenswrapper[4618]: E0121 10:09:41.544963 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:09:52 crc kubenswrapper[4618]: I0121 10:09:52.538021 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:09:52 crc kubenswrapper[4618]: E0121 10:09:52.538984 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.377408 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnk5z"] Jan 21 10:10:03 crc kubenswrapper[4618]: E0121 10:10:03.378347 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" containerName="gather" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378360 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" containerName="gather" Jan 21 10:10:03 crc kubenswrapper[4618]: E0121 10:10:03.378381 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" 
containerName="copy" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378387 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" containerName="copy" Jan 21 10:10:03 crc kubenswrapper[4618]: E0121 10:10:03.378401 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="extract-utilities" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378408 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="extract-utilities" Jan 21 10:10:03 crc kubenswrapper[4618]: E0121 10:10:03.378425 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="registry-server" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378430 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="registry-server" Jan 21 10:10:03 crc kubenswrapper[4618]: E0121 10:10:03.378446 4618 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="extract-content" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378451 4618 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="extract-content" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378624 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" containerName="copy" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378637 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f533448-f67c-46a4-bda8-f432fd43e484" containerName="gather" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.378647 4618 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45b7bd3-7686-4815-85e4-a0455c532d81" containerName="registry-server" Jan 21 
10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.380233 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.385184 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnk5z"] Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.439845 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-catalog-content\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.440025 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcm7\" (UniqueName: \"kubernetes.io/projected/2d533810-a29f-4fa0-af27-7bc329148f03-kube-api-access-kqcm7\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.440132 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-utilities\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.541829 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-catalog-content\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc 
kubenswrapper[4618]: I0121 10:10:03.541908 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcm7\" (UniqueName: \"kubernetes.io/projected/2d533810-a29f-4fa0-af27-7bc329148f03-kube-api-access-kqcm7\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.541944 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-utilities\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.542440 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-catalog-content\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.542477 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-utilities\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.558057 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcm7\" (UniqueName: \"kubernetes.io/projected/2d533810-a29f-4fa0-af27-7bc329148f03-kube-api-access-kqcm7\") pod \"redhat-operators-nnk5z\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.698164 4618 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.971096 4618 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tjmlm"] Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.973321 4618 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:03 crc kubenswrapper[4618]: I0121 10:10:03.975513 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tjmlm"] Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.091938 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnk5z"] Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.152948 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-catalog-content\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.153126 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwbs\" (UniqueName: \"kubernetes.io/projected/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-kube-api-access-8wwbs\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.153490 4618 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-utilities\") pod \"certified-operators-tjmlm\" (UID: 
\"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.255007 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwbs\" (UniqueName: \"kubernetes.io/projected/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-kube-api-access-8wwbs\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.255132 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-utilities\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.255179 4618 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-catalog-content\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.255595 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-utilities\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.255668 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-catalog-content\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") 
" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.260526 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerStarted","Data":"cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760"} Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.260652 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerStarted","Data":"b715e7788c2d07fe301100cd15b6ec8d44747235f5a665f84ec6742df9afa74f"} Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.271284 4618 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwbs\" (UniqueName: \"kubernetes.io/projected/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-kube-api-access-8wwbs\") pod \"certified-operators-tjmlm\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.291712 4618 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.538283 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12" Jan 21 10:10:04 crc kubenswrapper[4618]: E0121 10:10:04.538804 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca" Jan 21 10:10:04 crc kubenswrapper[4618]: I0121 10:10:04.777527 4618 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tjmlm"] Jan 21 10:10:04 crc kubenswrapper[4618]: W0121 10:10:04.778185 4618 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf1e11c_58ac_41f8_b5d0_2f45a2b643cd.slice/crio-2805a11f180a38ddaf9a73fcc9a58ff48266aa34da22645f3cf32c91a2d9c033 WatchSource:0}: Error finding container 2805a11f180a38ddaf9a73fcc9a58ff48266aa34da22645f3cf32c91a2d9c033: Status 404 returned error can't find the container with id 2805a11f180a38ddaf9a73fcc9a58ff48266aa34da22645f3cf32c91a2d9c033 Jan 21 10:10:05 crc kubenswrapper[4618]: I0121 10:10:05.269412 4618 generic.go:334] "Generic (PLEG): container finished" podID="2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" containerID="383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53" exitCode=0 Jan 21 10:10:05 crc kubenswrapper[4618]: I0121 10:10:05.269502 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjmlm" 
event={"ID":"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd","Type":"ContainerDied","Data":"383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53"} Jan 21 10:10:05 crc kubenswrapper[4618]: I0121 10:10:05.269547 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjmlm" event={"ID":"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd","Type":"ContainerStarted","Data":"2805a11f180a38ddaf9a73fcc9a58ff48266aa34da22645f3cf32c91a2d9c033"} Jan 21 10:10:05 crc kubenswrapper[4618]: I0121 10:10:05.271238 4618 generic.go:334] "Generic (PLEG): container finished" podID="2d533810-a29f-4fa0-af27-7bc329148f03" containerID="cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760" exitCode=0 Jan 21 10:10:05 crc kubenswrapper[4618]: I0121 10:10:05.271271 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerDied","Data":"cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760"} Jan 21 10:10:05 crc kubenswrapper[4618]: I0121 10:10:05.271321 4618 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 10:10:06 crc kubenswrapper[4618]: I0121 10:10:06.282637 4618 generic.go:334] "Generic (PLEG): container finished" podID="2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" containerID="ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015" exitCode=0 Jan 21 10:10:06 crc kubenswrapper[4618]: I0121 10:10:06.282704 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjmlm" event={"ID":"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd","Type":"ContainerDied","Data":"ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015"} Jan 21 10:10:06 crc kubenswrapper[4618]: I0121 10:10:06.298423 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" 
event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerStarted","Data":"e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c"} Jan 21 10:10:06 crc kubenswrapper[4618]: E0121 10:10:06.736351 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d533810_a29f_4fa0_af27_7bc329148f03.slice/crio-e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c.scope\": RecentStats: unable to find data in memory cache]" Jan 21 10:10:07 crc kubenswrapper[4618]: I0121 10:10:07.309835 4618 generic.go:334] "Generic (PLEG): container finished" podID="2d533810-a29f-4fa0-af27-7bc329148f03" containerID="e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c" exitCode=0 Jan 21 10:10:07 crc kubenswrapper[4618]: I0121 10:10:07.309925 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerDied","Data":"e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c"} Jan 21 10:10:07 crc kubenswrapper[4618]: I0121 10:10:07.312892 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjmlm" event={"ID":"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd","Type":"ContainerStarted","Data":"adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe"} Jan 21 10:10:07 crc kubenswrapper[4618]: I0121 10:10:07.348340 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tjmlm" podStartSLOduration=2.850894264 podStartE2EDuration="4.3483186s" podCreationTimestamp="2026-01-21 10:10:03 +0000 UTC" firstStartedPulling="2026-01-21 10:10:05.271041232 +0000 UTC m=+4004.021508549" lastFinishedPulling="2026-01-21 10:10:06.768465568 +0000 UTC m=+4005.518932885" observedRunningTime="2026-01-21 10:10:07.34154181 +0000 UTC 
m=+4006.092009127" watchObservedRunningTime="2026-01-21 10:10:07.3483186 +0000 UTC m=+4006.098785917" Jan 21 10:10:08 crc kubenswrapper[4618]: I0121 10:10:08.323328 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerStarted","Data":"c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65"} Jan 21 10:10:08 crc kubenswrapper[4618]: I0121 10:10:08.349069 4618 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnk5z" podStartSLOduration=2.818105184 podStartE2EDuration="5.349052579s" podCreationTimestamp="2026-01-21 10:10:03 +0000 UTC" firstStartedPulling="2026-01-21 10:10:05.272880108 +0000 UTC m=+4004.023347425" lastFinishedPulling="2026-01-21 10:10:07.803827502 +0000 UTC m=+4006.554294820" observedRunningTime="2026-01-21 10:10:08.339388151 +0000 UTC m=+4007.089855468" watchObservedRunningTime="2026-01-21 10:10:08.349052579 +0000 UTC m=+4007.099519895" Jan 21 10:10:13 crc kubenswrapper[4618]: I0121 10:10:13.698594 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:13 crc kubenswrapper[4618]: I0121 10:10:13.699370 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:13 crc kubenswrapper[4618]: I0121 10:10:13.737925 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:14 crc kubenswrapper[4618]: I0121 10:10:14.292191 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:14 crc kubenswrapper[4618]: I0121 10:10:14.292243 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tjmlm" Jan 
21 10:10:14 crc kubenswrapper[4618]: I0121 10:10:14.330486 4618 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:14 crc kubenswrapper[4618]: I0121 10:10:14.429445 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:14 crc kubenswrapper[4618]: I0121 10:10:14.429879 4618 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:15 crc kubenswrapper[4618]: I0121 10:10:15.393638 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tjmlm"] Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.408698 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tjmlm" podUID="2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" containerName="registry-server" containerID="cri-o://adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe" gracePeriod=2 Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.779721 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnk5z"] Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.780183 4618 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnk5z" podUID="2d533810-a29f-4fa0-af27-7bc329148f03" containerName="registry-server" containerID="cri-o://c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65" gracePeriod=2 Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.793529 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:16 crc kubenswrapper[4618]: E0121 10:10:16.954094 4618 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d533810_a29f_4fa0_af27_7bc329148f03.slice/crio-c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65.scope\": RecentStats: unable to find data in memory cache]" Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.989394 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wwbs\" (UniqueName: \"kubernetes.io/projected/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-kube-api-access-8wwbs\") pod \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.989614 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-utilities\") pod \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.989647 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-catalog-content\") pod \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\" (UID: \"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd\") " Jan 21 10:10:16 crc kubenswrapper[4618]: I0121 10:10:16.990945 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-utilities" (OuterVolumeSpecName: "utilities") pod "2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" (UID: "2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.000536 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-kube-api-access-8wwbs" (OuterVolumeSpecName: "kube-api-access-8wwbs") pod "2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" (UID: "2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd"). InnerVolumeSpecName "kube-api-access-8wwbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.031900 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" (UID: "2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.091918 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.091955 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.091970 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wwbs\" (UniqueName: \"kubernetes.io/projected/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd-kube-api-access-8wwbs\") on node \"crc\" DevicePath \"\"" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.098800 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnk5z" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.294470 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqcm7\" (UniqueName: \"kubernetes.io/projected/2d533810-a29f-4fa0-af27-7bc329148f03-kube-api-access-kqcm7\") pod \"2d533810-a29f-4fa0-af27-7bc329148f03\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.294571 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-utilities\") pod \"2d533810-a29f-4fa0-af27-7bc329148f03\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.294656 4618 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-catalog-content\") pod \"2d533810-a29f-4fa0-af27-7bc329148f03\" (UID: \"2d533810-a29f-4fa0-af27-7bc329148f03\") " Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.295450 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-utilities" (OuterVolumeSpecName: "utilities") pod "2d533810-a29f-4fa0-af27-7bc329148f03" (UID: "2d533810-a29f-4fa0-af27-7bc329148f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.299361 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d533810-a29f-4fa0-af27-7bc329148f03-kube-api-access-kqcm7" (OuterVolumeSpecName: "kube-api-access-kqcm7") pod "2d533810-a29f-4fa0-af27-7bc329148f03" (UID: "2d533810-a29f-4fa0-af27-7bc329148f03"). InnerVolumeSpecName "kube-api-access-kqcm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.383344 4618 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d533810-a29f-4fa0-af27-7bc329148f03" (UID: "2d533810-a29f-4fa0-af27-7bc329148f03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.397717 4618 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqcm7\" (UniqueName: \"kubernetes.io/projected/2d533810-a29f-4fa0-af27-7bc329148f03-kube-api-access-kqcm7\") on node \"crc\" DevicePath \"\"" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.397765 4618 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.397778 4618 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d533810-a29f-4fa0-af27-7bc329148f03-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.419411 4618 generic.go:334] "Generic (PLEG): container finished" podID="2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" containerID="adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe" exitCode=0 Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.419483 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tjmlm" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.419509 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjmlm" event={"ID":"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd","Type":"ContainerDied","Data":"adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe"} Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.419552 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjmlm" event={"ID":"2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd","Type":"ContainerDied","Data":"2805a11f180a38ddaf9a73fcc9a58ff48266aa34da22645f3cf32c91a2d9c033"} Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.419575 4618 scope.go:117] "RemoveContainer" containerID="adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe" Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.422508 4618 generic.go:334] "Generic (PLEG): container finished" podID="2d533810-a29f-4fa0-af27-7bc329148f03" containerID="c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65" exitCode=0 Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.422567 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerDied","Data":"c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65"} Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.422600 4618 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnk5z" event={"ID":"2d533810-a29f-4fa0-af27-7bc329148f03","Type":"ContainerDied","Data":"b715e7788c2d07fe301100cd15b6ec8d44747235f5a665f84ec6742df9afa74f"} Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.422614 4618 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnk5z"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.440063 4618 scope.go:117] "RemoveContainer" containerID="ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.451520 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tjmlm"]
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.457375 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tjmlm"]
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.463197 4618 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnk5z"]
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.468434 4618 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnk5z"]
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.476737 4618 scope.go:117] "RemoveContainer" containerID="383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.497045 4618 scope.go:117] "RemoveContainer" containerID="adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe"
Jan 21 10:10:17 crc kubenswrapper[4618]: E0121 10:10:17.497790 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe\": container with ID starting with adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe not found: ID does not exist" containerID="adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.497832 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe"} err="failed to get container status \"adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe\": rpc error: code = NotFound desc = could not find container \"adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe\": container with ID starting with adc21a96563b0b39c3ee0edd1479aa43f5defeb831709ae7dcddaa16e4e86bbe not found: ID does not exist"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.497867 4618 scope.go:117] "RemoveContainer" containerID="ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015"
Jan 21 10:10:17 crc kubenswrapper[4618]: E0121 10:10:17.498396 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015\": container with ID starting with ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015 not found: ID does not exist" containerID="ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.498456 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015"} err="failed to get container status \"ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015\": rpc error: code = NotFound desc = could not find container \"ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015\": container with ID starting with ed7225a46462adb6849242847d322208853c936554feaf05cebc562c73fd2015 not found: ID does not exist"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.498488 4618 scope.go:117] "RemoveContainer" containerID="383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53"
Jan 21 10:10:17 crc kubenswrapper[4618]: E0121 10:10:17.498816 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53\": container with ID starting with 383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53 not found: ID does not exist" containerID="383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.498850 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53"} err="failed to get container status \"383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53\": rpc error: code = NotFound desc = could not find container \"383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53\": container with ID starting with 383b859171e3f504a93a4ce4532dc69f9265cdf1da0e3fbd08dc9f746bedce53 not found: ID does not exist"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.498873 4618 scope.go:117] "RemoveContainer" containerID="c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.515153 4618 scope.go:117] "RemoveContainer" containerID="e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.547794 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd" path="/var/lib/kubelet/pods/2bf1e11c-58ac-41f8-b5d0-2f45a2b643cd/volumes"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.548484 4618 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d533810-a29f-4fa0-af27-7bc329148f03" path="/var/lib/kubelet/pods/2d533810-a29f-4fa0-af27-7bc329148f03/volumes"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.572765 4618 scope.go:117] "RemoveContainer" containerID="cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.599596 4618 scope.go:117] "RemoveContainer" containerID="c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65"
Jan 21 10:10:17 crc kubenswrapper[4618]: E0121 10:10:17.599953 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65\": container with ID starting with c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65 not found: ID does not exist" containerID="c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.599993 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65"} err="failed to get container status \"c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65\": rpc error: code = NotFound desc = could not find container \"c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65\": container with ID starting with c1066bbbb402c7bee31b19fe91547ec50070ceb584e7cf4226b0682deaaf0e65 not found: ID does not exist"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.600020 4618 scope.go:117] "RemoveContainer" containerID="e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c"
Jan 21 10:10:17 crc kubenswrapper[4618]: E0121 10:10:17.600326 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c\": container with ID starting with e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c not found: ID does not exist" containerID="e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.600358 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c"} err="failed to get container status \"e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c\": rpc error: code = NotFound desc = could not find container \"e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c\": container with ID starting with e7e87429f47b9da82c415b02fb0a85d34d859611a07027cca5044556921da97c not found: ID does not exist"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.600379 4618 scope.go:117] "RemoveContainer" containerID="cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760"
Jan 21 10:10:17 crc kubenswrapper[4618]: E0121 10:10:17.600620 4618 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760\": container with ID starting with cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760 not found: ID does not exist" containerID="cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760"
Jan 21 10:10:17 crc kubenswrapper[4618]: I0121 10:10:17.600706 4618 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760"} err="failed to get container status \"cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760\": rpc error: code = NotFound desc = could not find container \"cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760\": container with ID starting with cd312794d1f9c5bd06e62b9020c4291a988e0ca740f913181f9292c7ecf9e760 not found: ID does not exist"
Jan 21 10:10:18 crc kubenswrapper[4618]: I0121 10:10:18.538774 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"
Jan 21 10:10:18 crc kubenswrapper[4618]: E0121 10:10:18.539368 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca"
Jan 21 10:10:33 crc kubenswrapper[4618]: I0121 10:10:33.539990 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"
Jan 21 10:10:33 crc kubenswrapper[4618]: E0121 10:10:33.540881 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca"
Jan 21 10:10:44 crc kubenswrapper[4618]: I0121 10:10:44.537820 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"
Jan 21 10:10:44 crc kubenswrapper[4618]: E0121 10:10:44.538702 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca"
Jan 21 10:10:57 crc kubenswrapper[4618]: I0121 10:10:57.538402 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"
Jan 21 10:10:57 crc kubenswrapper[4618]: E0121 10:10:57.539589 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca"
Jan 21 10:11:10 crc kubenswrapper[4618]: I0121 10:11:10.538097 4618 scope.go:117] "RemoveContainer" containerID="a845d7676c06191b4e870e4edd6f15e256592649fc17252caec86951f7957f12"
Jan 21 10:11:10 crc kubenswrapper[4618]: E0121 10:11:10.538565 4618 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2bm47_openshift-machine-config-operator(f819fb41-8eb7-4f8f-85f9-752aa5716cca)\"" pod="openshift-machine-config-operator/machine-config-daemon-2bm47" podUID="f819fb41-8eb7-4f8f-85f9-752aa5716cca"